DDA3020 Homework 1
Due date: Oct 14, 2024
Instructions
• The deadline is 23:59, Oct 14, 2024.
• The weight of this assignment in the final grade is 20%.
• Electronic submission: Turn in solutions electronically via Blackboard. Be sure to submit
your homework as one PDF file plus two Python scripts. Please name your solution files
"DDA3020HW1 studentID name.pdf", "HW1 yourID Q1.ipynb", and "HW1 yourID Q2.ipynb"
(.py files are also acceptable).
• Note that late submissions will result in discounted scores: 0-24 hours → 80%, 24-120 hours
→ 50%, 120 or more hours → 0%.
• Answer the questions in English. Otherwise, you’ll lose half of the points.
• Collaboration policy: You need to solve all questions independently and collaboration between
students is NOT allowed.
1 Written Problems (50 points)
1.1. (Learning of Linear Regression, 25 points) Suppose we have training data

{(x_1, y_1), (x_2, y_2), . . . , (x_N, y_N)},

where x_i ∈ R^d and y_i ∈ R^k, i = 1, 2, . . . , N.
i) (9 pts) Find the closed-form solution of the following problem.
min
W,b
X
N
i=1
∥yi − Wxi − b∥
2
2
,
ii) (8 pts) Show how to use gradient descent to solve the problem. (Please state at least one
possible stopping criterion.)
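
For intuition, here is a minimal gradient-descent sketch for this objective (not a substitute for the written derivation); it assumes a fixed learning rate and a gradient-norm threshold as the stopping criterion, and the names lr, tol, and max_iter are illustrative:

import numpy as np

def fit_linear_gd(X, Y, lr=1e-3, tol=1e-6, max_iter=10000):
    """Minimize sum_i ||y_i - W x_i - b||_2^2 by batch gradient descent.
    X has shape (N, d) with rows x_i; Y has shape (N, k) with rows y_i."""
    W = np.zeros((Y.shape[1], X.shape[1]))
    b = np.zeros(Y.shape[1])
    for _ in range(max_iter):
        R = Y - X @ W.T - b          # residuals y_i - W x_i - b, shape (N, k)
        grad_W = -2 * R.T @ X        # gradient w.r.t. W
        grad_b = -2 * R.sum(axis=0)  # gradient w.r.t. b
        W -= lr * grad_W
        b -= lr * grad_b
        # Stopping criterion: overall gradient norm below the tolerance
        if np.sqrt((grad_W ** 2).sum() + (grad_b ** 2).sum()) < tol:
            break
    return W, b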
iii) (8 pts) We further suppose that x_1, x_2, . . . , x_N are drawn from N(µ, σ^2). Show that the
maximum likelihood estimate (MLE) of σ^2 is

    σ̂^2_MLE = (1/N) Σ_{n=1}^{N} (x_n − µ_MLE)^2.
1.2. (Support Vector Machine, 25 points) Given two positive samples x_1 = (3, 3)^T and
x_2 = (4, 3)^T, and one negative sample x_3 = (1, 1)^T, find the maximum-margin separating
hyperplane and support vectors.
Solution steps:
i) Formulating the Optimization Problem (5 pts)
ii) Constructing the Lagrangian (5 pts)
iii) Using KKT Conditions (5 pts)
iv) Solving the Equations (5 pts)
v) Determining the Hyperplane Equation and Support Vectors (5 pts)
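
These steps are meant to be worked by hand, but you can sanity-check your final answer numerically with sklearn, since a very large C approximates the hard-margin SVM; a minimal sketch, assuming labels +1 for the positive samples and −1 for the negative one:

import numpy as np
from sklearn.svm import SVC

X = np.array([[3.0, 3.0], [4.0, 3.0], [1.0, 1.0]])
y = np.array([1, 1, -1])
clf = SVC(kernel="linear", C=1e6).fit(X, y)  # large C ~ hard margin
print("w:", clf.coef_[0])                    # normal vector of the hyperplane
print("b:", clf.intercept_[0])               # bias term
print("support vector indices:", clf.support_)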
2 Programming (50 points)
2.1. (Linear regression, 25 points) We have a labeled dataset D = {(x_1, y_1), (x_2, y_2), . . . , (x_n, y_n)},
with x_i ∈ R^d being the d-dimensional feature vector of the i-th sample, and y_i ∈ R being the
real-valued target (label).
A linear regression model is given by

    f_{w_0,...,w_d}(x) = w_0 + w_1 x_1 + w_2 x_2 + · · · + w_d x_d,    (1)

where w_0 is often called the bias and w_1, w_2, . . . , w_d are often called the coefficients.
Now, we want to utilize the dataset D to build a linear model based on linear regression.
We provide a training set D_train that includes 2024 labeled samples with 11 features (see
linear regression train.txt) to fit the model, and a test set D_test that includes 10 unlabeled
samples with 11 features (see linear regression test.txt) to evaluate the model.
1. Use the LinearRegression class from the sklearn package to obtain the bias w_0 and the
coefficients w_1, w_2, . . . , w_11, then compute ŷ = f(x) on the test set D_test with the trained
model. (Put the estimates of w_0, w_1, . . . , w_11 and these ŷ in your answers.)
2. Implement linear regression by yourself to obtain the bias w_0 and the coefficients
w_1, w_2, . . . , w_11, then compute ŷ = f(x) on the test set D_test. (Put the estimates of
w_0, w_1, . . . , w_11 and these ŷ in your answers. You may compute the inverse of a matrix
using an existing Python package.)
(Hint: Note that linear regression train.txt has 2024 rows with 12 columns, where the first 11
columns are the features x and the last column is the target y, while linear regression test.txt
contains only 10 rows with 11 columns (features). Both tasks require the submission of code
and results. Put all the code in a "HW1 yourID Q1.ipynb" Jupyter notebook (a ".py" file is
also acceptable).)
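
A minimal sketch covering both subtasks, assuming whitespace-separated files with the layout described in the hint (the underscored file names are an assumption about the exact spelling):

import numpy as np
from sklearn.linear_model import LinearRegression

train = np.loadtxt("linear_regression_train.txt")  # assumed file name
X_train, y_train = train[:, :11], train[:, 11]     # 11 features, last column is y
X_test = np.loadtxt("linear_regression_test.txt")  # assumed file name

# Task 1: sklearn's LinearRegression
reg = LinearRegression().fit(X_train, y_train)
print("w0:", reg.intercept_, "w1..w11:", reg.coef_)
print("y_hat:", reg.predict(X_test))

# Task 2: closed-form normal equation w = (A^T A)^{-1} A^T y,
# where A prepends a column of ones for the bias
A = np.hstack([np.ones((X_train.shape[0], 1)), X_train])
w = np.linalg.inv(A.T @ A) @ A.T @ y_train
print("w0:", w[0], "w1..w11:", w[1:])
print("y_hat:", np.hstack([np.ones((X_test.shape[0], 1)), X_test]) @ w)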
2.2. (SVM, 25 points)
Task Description You are asked to write a program that constructs support vector machine
models with different kernel functions and slack variables.
Datasets You are provided with the iris dataset. The data set contains 3 classes of 50 instances
each, where each class refers to a type of iris plant. There are four features: 1. sepal length in cm;
2. sepal width in cm; 3. petal length in cm; 4. petal width in cm. You need to use these features
to classify each iris plant as one of the three possible types.
What you should do You should use the SVM functionality from the Python sklearn package,
which provides various forms of SVM. For multiclass SVM you should use the one-vs-rest
strategy. You are recommended to use the sklearn.svm.SVC() function. You can use numpy for
vector manipulation. In your technical report, you should report the results required as mentioned
below (e.g. training error, testing error, and so on).
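
Note that sklearn.svm.SVC handles multiclass input with a one-vs-one scheme internally, so an explicit one-vs-rest setup either wraps it in OneVsRestClassifier or fits one binary SVC per class yourself; a minimal sketch, assuming the iris data is loaded through sklearn.datasets:

from sklearn.datasets import load_iris
from sklearn.multiclass import OneVsRestClassifier
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
# One binary linear SVC per class, each separating that class from the rest
ovr = OneVsRestClassifier(SVC(kernel="linear")).fit(X, y)
for cls, est in zip(ovr.classes_, ovr.estimators_):
    print("class", cls, "w:", est.coef_[0], "b:", est.intercept_[0])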
1. (2 points) Split training set and test set. Split the data into a training set and a test set.
The training set should contain 70% of the samples, while the test set should include 30%.
The number of samples from each category in both the training and test sets should reflect
this 70-30 split; for each category, the first 70% of the samples will form the training set, and
the remaining 30% will form the test set. Ensure that the split maintains the original order
of the data. You should report the instance ids in the split training set and test set. The output
format is as follows:
Q2.2.1 Split training set and test set:
Training set: xx
Test set: xx
You should fill in each xx in the template. You should write the ids for each set on the same
line, comma separated, e.g. Training set: [1, 4, 19].
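
A minimal sketch of this per-class ordered split, assuming the iris data is kept in its original class-grouped order (50 instances per class):

import numpy as np
from sklearn.datasets import load_iris

X, y = load_iris(return_X_y=True)
train_idx, test_idx = [], []
for c in np.unique(y):
    ids = np.where(y == c)[0]   # instance ids of this class, original order
    cut = int(len(ids) * 0.7)   # first 70% go to the training set
    train_idx.extend(ids[:cut].tolist())
    test_idx.extend(ids[cut:].tolist())
print("Training set:", train_idx)
print("Test set:", test_idx)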
2. (10 points) Calculation using Standard SVM Model (Linear Kernel). Employ the
standard SVM model with a linear kernel. Train your SVM on the split training dataset and
validate it on the testing dataset. Calculate the classification error for both the training and
testing datasets, and output the weight vector w, the bias b, and the indices of the support
vectors (starting from 0). Note that the scikit-learn package does not offer a function with a
hard margin, so we will simulate this using C = 1e5. You should first print out the total
training error and testing error, where the error is (number of wrong predictions) / (number
of data points). Then, print out the results for each class separately (note that you should
calculate errors for each class separately in this part). You should also mention in your report
which classes are linearly separable with SVM without slack.
The output format is as follows:
Q2.2.2 Calculation using Standard SVM Model:
total training error: xx, total testing error: xx,
class setosa:
training error: xx, testing error: xx,
w: xx, b: xx,
support vector indices: xx,
class versicolor:
training error: xx, testing error: xx,
w: xx, b: xx,
support vector indices: xx,
class virginica:
training error: xx, testing error: xx,
w: xx, b: xx,
support vector indices: xx,
Linear separable classes: xx
If we view the one-vs-rest strategy as combining multiple different SVMs, each one being a
separating hyperplane between one class and the rest of the points, then the w, b, and support
vector indices for that class are the parameters of the SVM separating this class from the rest
of the points. If a variable is of vector form, say a = (1, 2, 3)^T, then you should write each
entry on the same line, comma separated, e.g. [1,2,3].
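
A minimal sketch of extracting w, b, and the 0-based support vector indices for one such one-vs-rest binary problem, assuming X_train and y_train come from the split in Q2.2.1 and using C = 1e5 to simulate the hard margin:

import numpy as np
from sklearn.svm import SVC

c = 0                                       # e.g. the setosa-vs-rest problem
y_bin = np.where(y_train == c, 1, -1)       # +1 for class c, -1 for the rest
clf = SVC(kernel="linear", C=1e5).fit(X_train, y_bin)
print("w:", clf.coef_[0])                   # weight vector of the hyperplane
print("b:", clf.intercept_[0])              # bias
print("support vector indices:", clf.support_)  # 0-based indices into X_train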
3. (6 points) Calculation using SVM with Slack Variables (Linear Kernel). For each
C = 0.25 × t, where t = 1, 2, . . . , 4, train your SVM on the training dataset, and subsequently
validate it on the testing dataset. Calculate the classification error for both the training and
testing datasets, the weight vector w, the bias b, the indices of the support vectors, and the
slack variables ζ of the support vectors (you may compute them as max(0, 1 − y · f(x))). The
output format is as follows:
Q2.2.3 Calculation using SVM with Slack Variables (C = 0.25 × t, where t = 1, . . . , 4):
-------------------------------------------
C=0.25,
total training error: xx, total testing error: xx,
class setosa:
training error: xx, testing error: xx,
w: xx, b: xx,
support vector indices: xx,
slack variable: xx,
class versicolor:
training error: xx, testing error: xx,
w: xx, b: xx,
support vector indices: xx,
slack variable: xx,
class virginica:
training error: xx, testing error: xx,
w: xx, b: xx,
support vector indices: xx,
slack variable: xx,
-------------------------------------------
C=0.5,
<... results for (C=0.5) ...>
-------------------------------------------
C=0.75,
<... results for (C=0.75) ...>
-------------------------------------------
C=1,
<... results for (C=1) ...>
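
A minimal sketch of the suggested slack computation ζ_i = max(0, 1 − y_i · f(x_i)), evaluated at the support vectors of a fitted binary SVC (reusing clf, X_train, and y_bin from the previous sketch):

import numpy as np

sv = clf.support_                            # indices of the support vectors
f = clf.decision_function(X_train[sv])       # f(x) at the support vectors
zeta = np.maximum(0.0, 1.0 - y_bin[sv] * f)  # zero when on or beyond the margin
print("slack variables:", zeta)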
4. (7 points) Calculation using SVM with Kernel Functions. Conduct experiments with
different kernel functions for SVM without slack variables. Calculate the classification error
for both the training and testing datasets, and the indices of the support vectors, for each
kernel type:
(a) 2nd-order Polynomial Kernel
(b) 3rd-order Polynomial Kernel
(c) Radial Basis Function Kernel with σ = 1
(d) Sigmoidal Kernel with σ = 1
The output format is as follows:
Q2.2.4 Calculation using SVM with Kernel Functions:
-------------------------------------------
(a) 2nd-order Polynomial Kernel,
total training error: xx, total testing error: xx,
class setosa:
training error: xx, testing error: xx,
w: xx, b: xx,
support vector indices: xx,
class versicolor:
training error: xx, testing error: xx,
w: xx, b: xx,
support vector indices: xx,
class virginica:
training error: xx, testing error: xx,
w: xx, b: xx,
support vector indices: xx,
-------------------------------------------
(b) 3rd-order Polynomial Kernel,
<... results for (b) ...>
-------------------------------------------
(c) Radial Basis Function Kernel with σ = 1,
<... results for (c) ...>
-------------------------------------------
(d) Sigmoidal Kernel with σ = 1,
<... results for (d) ...>
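
One parameterization detail: sklearn writes the RBF kernel as exp(−γ‖x − x′‖²) and the sigmoid kernel as tanh(γ⟨x, x′⟩ + r), so σ = 1 must be translated into γ; under the common convention K(x, x′) = exp(−‖x − x′‖²/(2σ²)) this gives γ = 1/(2σ²), an assumption worth stating in your report. A minimal sketch, again with C = 1e5 for the no-slack simulation and reusing X_train and y_bin from the earlier sketches:

from sklearn.svm import SVC

sigma = 1.0
gamma = 1.0 / (2.0 * sigma ** 2)  # assumes K = exp(-||x - x'||^2 / (2 sigma^2))
models = {
    "(a) poly, degree 2":   SVC(kernel="poly", degree=2, C=1e5),
    "(b) poly, degree 3":   SVC(kernel="poly", degree=3, C=1e5),
    "(c) rbf, sigma=1":     SVC(kernel="rbf", gamma=gamma, C=1e5),
    "(d) sigmoid, sigma=1": SVC(kernel="sigmoid", gamma=gamma, C=1e5),
}
for name, m in models.items():
    m.fit(X_train, y_bin)
    print(name, "support vector indices:", m.support_)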
Submission Submit your executable code in a "HW1 yourID Q2.ipynb" Jupyter notebook (a ".py"
file is also acceptable). Indicate the corresponding question number in a comment for each cell,
and ensure that your code can logically produce the required results for each question in the
required format. Please note that you need to write clear comments and use appropriate
function/variable names. Excessively unreadable code may result in point deductions.

6

請加QQ:99515681  郵箱:99515681@qq.com   WX:codinghelp




 

掃一掃在手機打開當前頁
  • 上一篇:代做CS 259、Java/c++設計程序代寫
  • 下一篇:代做MSE 280、代寫Matlab程序語言
  • 無相關信息
    合肥生活資訊

    合肥圖文信息
    急尋熱仿真分析?代做熱仿真服務+熱設計優化
    急尋熱仿真分析?代做熱仿真服務+熱設計優化
    出評 開團工具
    出評 開團工具
    挖掘機濾芯提升發動機性能
    挖掘機濾芯提升發動機性能
    海信羅馬假日洗衣機亮相AWE  復古美學與現代科技完美結合
    海信羅馬假日洗衣機亮相AWE 復古美學與現代
    合肥機場巴士4號線
    合肥機場巴士4號線
    合肥機場巴士3號線
    合肥機場巴士3號線
    合肥機場巴士2號線
    合肥機場巴士2號線
    合肥機場巴士1號線
    合肥機場巴士1號線
  • 短信驗證碼 豆包 幣安下載 AI生圖 目錄網

    關于我們 | 打賞支持 | 廣告服務 | 聯系我們 | 網站地圖 | 免責聲明 | 幫助中心 | 友情鏈接 |

    Copyright © 2025 hfw.cc Inc. All Rights Reserved. 合肥網 版權所有
    ICP備06013414號-3 公安備 42010502001045

    99爱在线视频这里只有精品_窝窝午夜看片成人精品_日韩精品久久久毛片一区二区_亚洲一区二区久久

          9000px;">

                国产精选一区二区三区| 国产精品99久久不卡二区| 精品1区2区在线观看| 欧美午夜精品一区二区蜜桃| 国产乱码精品一区二区三| 成人中文字幕电影| 91精品国产品国语在线不卡| 日韩精品一区二区三区在线播放| 日韩欧美一区在线观看| 国产亲近乱来精品视频| 亚洲午夜av在线| 在线观看一区不卡| 亚洲大片在线观看| 欧美日韩国产一区| 亚洲国产aⅴ成人精品无吗| 国产91精品免费| 国产精品久久777777| 精品在线观看免费| 久久久蜜桃精品| 六月丁香婷婷久久| 91成人免费在线视频| 亚洲精品菠萝久久久久久久| 蜜桃一区二区三区四区| 欧美在线看片a免费观看| 亚洲一区二区影院| 日韩一区二区精品| 国产成人在线看| 亚洲激情五月婷婷| 日韩手机在线导航| 99久久免费精品| 五月天精品一区二区三区| 欧美蜜桃一区二区三区| 久久99这里只有精品| 制服丝袜成人动漫| 九九精品视频在线看| 日韩免费观看高清完整版| 亚洲综合色自拍一区| 欧美日韩成人一区二区| 午夜精品久久久久久久99樱桃| 欧美日精品一区视频| 天堂久久久久va久久久久| 69p69国产精品| 成人免费视频app| 美女网站色91| 免费成人你懂的| 婷婷综合五月天| 日韩在线观看一区二区| 亚洲视频1区2区| 亚洲男人的天堂一区二区 | 欧美一区二区黄| 色呦呦日韩精品| 91国产免费看| 成人一区二区视频| 粉嫩aⅴ一区二区三区四区| 午夜精品一区在线观看| 国产丝袜美腿一区二区三区| 粉嫩av亚洲一区二区图片| 亚洲男女一区二区三区| 欧美zozozo| 久久精品一区二区三区不卡 | 欧美日韩在线三区| 色88888久久久久久影院按摩| 精品综合久久久久久8888| 一二三区精品视频| 日韩电影在线一区二区三区| 久久久欧美精品sm网站| 欧美老女人在线| 欧美经典三级视频一区二区三区| 国产精品系列在线| 亚洲视频在线观看一区| 日韩一区国产二区欧美三区| 香蕉成人伊视频在线观看| 天堂蜜桃一区二区三区| 蜜臀久久久99精品久久久久久| 亚洲视频香蕉人妖| 韩国毛片一区二区三区| 成人少妇影院yyyy| 91精品国产色综合久久不卡电影| 7777精品伊人久久久大香线蕉的| 欧美日韩亚洲综合在线 | 不卡的av中国片| 久久久久国产精品厨房| 一区二区三区高清| 91啪亚洲精品| 一区二区三区四区在线| 国产91在线看| 中文幕一区二区三区久久蜜桃| 日本成人中文字幕在线视频| 久久er99热精品一区二区| 欧美日精品一区视频| 亚洲日本在线看| 99久久99久久精品国产片果冻| 日韩一区二区三区电影| 亚洲一二三专区| 91久久人澡人人添人人爽欧美| 国产精品国产自产拍高清av王其| 蜜桃视频在线观看一区二区| 91麻豆123| 久久91精品久久久久久秒播| 91美女在线观看| 日韩不卡手机在线v区| 在线不卡的av| 国产91丝袜在线播放| 中文字幕一区av| 欧美精品粉嫩高潮一区二区| 亚洲综合在线视频| 精品久久人人做人人爽| 成人做爰69片免费看网站| 欧美韩日一区二区三区| 成人av网站大全| 成人app软件下载大全免费| 高清成人免费视频| 日韩欧美国产综合| 亚洲精品一卡二卡| 日韩高清中文字幕一区| 亚洲成人在线免费| 亚洲综合自拍偷拍| 成人黄页毛片网站| 精品视频一区三区九区| 日本韩国欧美在线| 欧美精品一区二区高清在线观看| 国产欧美一区二区三区鸳鸯浴| 粉嫩绯色av一区二区在线观看| 欧美精品第1页| 一本久久a久久精品亚洲| 视频一区二区欧美| 综合激情网...| 亚洲男人电影天堂| 偷拍日韩校园综合在线| 国产精品丝袜在线| 国产女人18毛片水真多成人如厕| 欧美久久久久久蜜桃| 色综合久久久久| 日韩在线一区二区| 麻豆成人免费电影| 97久久超碰国产精品| 成人动漫av在线| 欧美性videosxxxxx| 欧美伦理影视网| 中文字幕一区二区三区在线不卡 | 亚洲免费伊人电影| 毛片一区二区三区| 91免费观看在线| 欧美极品少妇xxxxⅹ高跟鞋| 欧美片网站yy| 一色屋精品亚洲香蕉网站| 亚洲欧美在线视频观看| 自拍偷在线精品自拍偷无码专区| 国产精品美女久久久久av爽李琼| 久久综合久久综合久久| 国产人成亚洲第一网站在线播放| 欧美一区二区在线免费观看| 一本到不卡免费一区二区| 精品视频在线看| 日韩理论电影院| 99久久精品免费看国产免费软件| 欧美日韩国产综合草草| 国产精品国产三级国产普通话蜜臀 | 99麻豆久久久国产精品免费优播| 在线欧美一区二区| 亚洲欧美偷拍另类a∨色屁股| 秋霞成人午夜伦在线观看| 99久久精品国产导航| 欧美日韩国产电影| 一区二区久久久| 欧美日韩精品一区二区三区蜜桃| www激情久久| 韩日av一区二区| 亚洲三级在线播放| 国产欧美一区二区三区网站| 奇米色一区二区三区四区| 91亚洲国产成人精品一区二区三| 日韩欧美中文字幕精品| 丝袜诱惑制服诱惑色一区在线观看| 91在线一区二区| 蓝色福利精品导航| xfplay精品久久| 成人av影视在线观看| 中文字幕高清一区| 91福利在线看| 91在线观看免费视频| 亚洲精品乱码久久久久久| 国产精品一区二区在线看| 久久一夜天堂av一区二区三区| 美女www一区二区| 亚洲桃色在线一区| 国产午夜亚洲精品理论片色戒| 99久久精品久久久久久清纯| 国产精品拍天天在线| 精品视频色一区| 在线免费观看成人短视频| 精品一区二区三区在线观看国产 | 91麻豆swag| 久久久久久日产精品| 久久久久免费观看| 丁香天五香天堂综合| 欧美日韩国产在线观看| 日韩免费电影一区| 日韩综合小视频| 日韩三级免费观看|