
UCSD CSE 167 Assignment 3:

3D OpenGL Rendering

Figure 1: We will develop an interactive interface for inspecting 3D models in this homework.

As you can probably tell from the previous homeworks, rendering requires computing interactions between millions of pixels and billions of triangles. This leads to significant performance challenges, especially when we want to interact with the content in real time. To make things really fast, pioneers in computer graphics came up with the solution of using domain-specific hardware to speed up rendering. Instead of using a general-purpose computer to compute everything, we build chips that specialize in rendering. These processors are called Graphics Processing Units (GPUs). The idea of GPUs can be traced back more than 40 years: the first GPU, the Geometry Engine, was developed by Jim Clark and Marc Hannah in 1981. Jim Clark formed the company Silicon Graphics Inc. (SGI) in the same year, and SGI became one of the most important computer graphics companies in history. Nowadays, GPUs have proven general enough to handle a very wide range of computation, including deep learning and many scientific computing tasks, and they are indispensable to society. The GPU is one of the most successful examples of domain-specific hardware.

In this homework, we will write code to render things using the GPU on your computer. To command the GPU, we need to send it commands through some sort of "Application Programming Interface" (API). These interfaces are collectively decided by the GPU companies and some other organizations, and each piece of hardware comes with "drivers" that actually implement these interfaces using the underlying hardware instructions. The most popular APIs are OpenGL, DirectX, Metal, Vulkan, and WebGPU. Among these, DirectX is Windows-only, Metal is macOS-only, WebGPU is only for browsers, and Vulkan is extremely low-level and very verbose in exchange for fine-grained control (it takes literally a thousand lines to render a single triangle in Vulkan). Therefore, we will use OpenGL in this homework: even though DirectX, Metal, and Vulkan are more up to date (the latest OpenGL version was released 6 years ago), OpenGL is still used in practice and supported by all major GPUs and OSes, and it is significantly easier to learn than the other, lower-level APIs. Just like with programming languages, it'll be a lot easier to learn other APIs once you've learned OpenGL.

In this homework, we will mostly follow an online tutorial, learnopengl.com, because they likely write significantly better tutorials than I do. We will implement what we did in the previous homework in OpenGL and hopefully see a significant speedup. We will also create a Graphical User Interface (GUI) and enable real-time interaction.

This homework is also more "open-ended" than the previous ones. We do not ask you to produce the exact same output as we do. At this point, you should be familiar with the theory of rasterization. We're just wrangling with the hardware interface, so allowing a bit of creativity seems reasonable.


1 Creating a window (10 pts)

Our first task, instead of rendering a single triangle, is to create a window! Read the chapters OpenGL, Creating a window, and Hello Window on learnopengl.com to see how to create a window with an OpenGL context using GLFW. Pick your favorite background color. We have included GLFW and glad in balboa, so you shouldn't have to download them. We're using OpenGL 3.3, but feel free to use the version you like.

Implement your code in hw_3_1 in hw3.cpp. Test it using

./balboa -hw 3_1

Once you are done, take a screenshot of the window you created and save it as outputs/hw_3_1.png.

2 Rendering a single 2D triangle (20 pts)

Yeah, it's that time again! Read the Hello Triangle chapter and render a single triangle with a constant color (pick the one you like the most). Make sure you've become familiar with the ideas of shaders, VAOs, VBOs, and EBOs. To make things slightly different, so that we are not just copying and pasting code, let the triangle rotate in the image plane over time (clockwise or counterclockwise, your choice). You can implement the rotation whichever way you want, but I recommend doing it in the vertex shader. Read the Shaders chapter to understand how to pass in a uniform variable; you can then use the uniform variable as the rotation angle.
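The math the shader needs is just a 2D rotation about the origin. Here is a sketch in C++ rather than GLSL, so you can sanity-check the values on the CPU; the variable names (and the idea of a uniform named angle) are illustrative, not part of balboa:

```cpp
#include <cmath>
#include <utility>

// Rotate a 2D point counterclockwise by `angle` radians about the origin.
// A vertex shader with a `uniform float angle` would compute the same thing:
//   gl_Position = vec4(c*aPos.x - s*aPos.y, s*aPos.x + c*aPos.y, aPos.z, 1.0);
std::pair<float, float> rotate2d(float x, float y, float angle) {
    float c = std::cos(angle);
    float s = std::sin(angle);
    return { c * x - s * y, s * x + c * y };
}
```

Update the uniform once per frame (e.g., from glfwGetTime()) and the triangle spins.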

float vs. double By default, balboa uses double-precision floats through the Real type. However, by default, GLSL uses single-precision floats. Be careful of this discrepancy. You can use Vector3f/Matrix3x3f to switch to float in balboa. Also feel free to use the glm library, which is used in the tutorial.
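For instance, when uploading a matrix with glUniformMatrix4fv, the doubles have to be narrowed to floats first. A sketch, assuming a plain row-indexed 4×4 double array (balboa's actual Matrix4x4 type may differ):

```cpp
#include <array>

// Flatten a row-indexed 4x4 double matrix into the column-major float[16]
// layout that glUniformMatrix4fv expects (with transpose = GL_FALSE).
// If your matrix type is already column-major, skip the index swap.
std::array<float, 16> to_gl_floats(const double m[4][4]) {
    std::array<float, 16> out{};
    for (int col = 0; col < 4; ++col)
        for (int row = 0; row < 4; ++row)
            out[col * 4 + row] = static_cast<float>(m[row][col]);
    return out;
}
```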

Implement your code in hw_3_2 in hw3.cpp. Test it using

./balboa -hw 3_2

This time, do a screen recording of your rotating triangle and save it as outputs/hw_3_2.mp4 (or whatever

encoding you are using).

3 Rendering 3D triangle meshes with transformations (35 pts)

Next, we'll use OpenGL to render the type of scenes we handled in the previous homework. Read the chapters Transformations, Coordinate Systems, and Camera; they should give you enough knowledge to render JSON scenes like the ones in the previous homeworks.

This part is a big jump from the previous parts. I would recommend doing things incrementally: e.g., handle two 2D triangles first, add the projection matrix, add the view matrix, add the model matrix, handle multiple triangle meshes, and finally add camera interaction.

Below are some notes and tips:

Clip space. In Homework 2, our projection matrix converted from camera space directly to screen space. In OpenGL, the hardware expects the projection to convert from camera space to clip space, which by default ranges from −1 to 1 in the x, y, and z axes. Everything outside of the clip space is clipped. Note that the clipping happens on the far side of z as well – we use the z_far parameter of the camera in our JSON scene to specify this. The difference in spaces means that we need to use a different projection matrix:

P = \begin{pmatrix}
\frac{1}{as} & 0 & 0 & 0 \\
0 & \frac{1}{s} & 0 & 0 \\
0 & 0 & -\frac{z_{\mathrm{far}}}{z_{\mathrm{far}} - z_{\mathrm{near}}} & -\frac{z_{\mathrm{far}} z_{\mathrm{near}}}{z_{\mathrm{far}} - z_{\mathrm{near}}} \\
0 & 0 & -1 & 0
\end{pmatrix},  (1)


where s is the scaling/film size parameter as before, and a is the aspect ratio. The first and second rows scale the x and y clipping planes to [−1, 1], respectively. The third row maps z values between −znear and −zfar into the clip-space depth range. The fourth row implements the perspective projection using homogeneous coordinates.

Depth test. By default, OpenGL does not reject triangles when they are occluded. Remember to turn on depth testing using glEnable(GL_DEPTH_TEST) and clear the Z buffer (e.g., glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT)).

Vertex colors. In contrast to the learnopengl tutorial, balboa stores the vertex colors in a separate array. Therefore it's likely more convenient to create two VBOs:

unsigned int VBO_vertex;
glGenBuffers(1, &VBO_vertex);
glBindBuffer(GL_ARRAY_BUFFER, VBO_vertex);
glBufferData(GL_ARRAY_BUFFER, ...);
glVertexAttribPointer(0 /* layout index */,
    3, GL_FLOAT, GL_FALSE, 3 * sizeof(float), (void*)0);
glEnableVertexAttribArray(0);

unsigned int VBO_color;
glGenBuffers(1, &VBO_color);
glBindBuffer(GL_ARRAY_BUFFER, VBO_color);
glBufferData(GL_ARRAY_BUFFER, ...);
glVertexAttribPointer(1 /* layout index */,
    3, GL_FLOAT, GL_FALSE, 3 * sizeof(float), (void*)0);
glEnableVertexAttribArray(1);

You only need one VAO per mesh regardless.

Multiple meshes. To handle multiple meshes in a scene, create a VAO for each mesh.

Window resizing. We don't require you to handle window resizing in this homework. It's annoying because you'll need to regenerate the projection matrix every time the aspect ratio changes.

Gamma correction. When we save an image in balboa, we perform gamma correction by taking a power of 1/2.2. OpenGL does not do this by default. To enable gamma correction, use glEnable(GL_FRAMEBUFFER_SRGB). Read the Gamma Correction chapter on learnopengl.com to learn more.
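Concretely, the correction applied when saving is just a power curve (a sketch; note that the sRGB transform GL_FRAMEBUFFER_SRGB applies on the GPU is only approximately this power function):

```cpp
#include <cmath>

// Map a linear color channel in [0, 1] to display space with gamma 2.2,
// i.e. the same power-of-1/2.2 that balboa applies when saving images.
double gamma_correct(double linear) {
    return std::pow(linear, 1.0 / 2.2);
}
```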

Camera interaction. Like the tutorial, you should also implement a simple camera interaction scheme; see the Camera chapter. A simple WSAD-style translation suffices. To obtain the camera direction and right vector, you can look at the columns of the cam_to_world matrix.
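A sketch of extracting those vectors and applying a WSAD move, assuming a row-indexed 4×4 matrix and the usual convention that the camera looks down its local −z axis (check your own conventions before copying this):

```cpp
#include <array>

using Mat4 = std::array<std::array<double, 4>, 4>;
using Vec3d = std::array<double, 3>;

// First column of cam_to_world: the camera's right axis in world space.
Vec3d camera_right(const Mat4& c2w) {
    return { c2w[0][0], c2w[1][0], c2w[2][0] };
}

// The camera looks down its local -z axis, so the viewing direction is the
// negated third column.
Vec3d camera_forward(const Mat4& c2w) {
    return { -c2w[0][2], -c2w[1][2], -c2w[2][2] };
}

// Move the camera position (fourth column) along `dir`: forward for W,
// -forward for S, -right for A, right for D.
void translate_camera(Mat4& c2w, const Vec3d& dir, double step) {
    for (int i = 0; i < 3; ++i) c2w[i][3] += step * dir[i];
}
```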

As a bonus (15 pts), add camera rotation based on mouse input like in the tutorial. Note that the rotation in the tutorial assumes a particular camera frame and would not work for our case. I recommend doing the following: 1) store the yaw and pitch angles and the original cam_to_world matrix from the scene; 2) update the yaw and pitch based on the mouse movement offsets like in the tutorial; 3) form a rotation matrix R based on yaw and pitch, then form a new cam_to_world matrix by multiplying the original cam_to_world matrix with R. (Don't overwrite the original cam_to_world matrix!)

For rotation, it might be tempting to keep only one cam_to_world matrix and repeatedly multiply it with new rotation matrices. However, this is going to produce unintuitive behavior (try it!), since yaw and pitch rotations are not commutative: applying yaw first and then pitch produces a different result than applying pitch first and then yaw. As a result, when you chain together many pitch and yaw rotations, they will not represent the desired rotation. Yes, rotation is weird. This is why you should explicitly store the yaw and pitch angles and modify those instead.
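A sketch of step 3, assuming yaw rotates about the camera's y axis and pitch about its x axis (which axes these should be depends on your camera frame):

```cpp
#include <array>
#include <cmath>

using Mat3 = std::array<std::array<double, 3>, 3>;

// Rebuild R = R_yaw * R_pitch from the stored angles every frame. Rebuilding
// from the angles (instead of accumulating small rotations into cam_to_world)
// keeps the yaw/pitch order fixed and avoids drift.
Mat3 yaw_pitch_rotation(double yaw, double pitch) {
    double cy = std::cos(yaw), sy = std::sin(yaw);
    double cp = std::cos(pitch), sp = std::sin(pitch);
    Mat3 ry = {{ {{cy, 0, sy}}, {{0, 1, 0}}, {{-sy, 0, cy}} }};  // about y
    Mat3 rx = {{ {{1, 0, 0}}, {{0, cp, -sp}}, {{0, sp, cp}} }};  // about x
    Mat3 r{};
    for (int i = 0; i < 3; ++i)
        for (int j = 0; j < 3; ++j)
            for (int k = 0; k < 3; ++k)
                r[i][j] += ry[i][k] * rx[k][j];
    return r;
}
```

Multiply the original cam_to_world by this R (extended to 4×4) to get the current camera pose.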


Passing parameters in callback functions. If you dislike global variables as much as I do, you will like the functions glfwSetWindowUserPointer and glfwGetWindowUserPointer. You can use them like this:

void mouse_callback(GLFWwindow* window, double xpos, double ypos) {
    StructIWanttoPasstoCallback *data_ptr =
        static_cast<StructIWanttoPasstoCallback*>(
            glfwGetWindowUserPointer(window));
}

GLFWwindow* window = glfwCreateWindow(width, height, "Balboa", NULL, NULL);
StructIWanttoPasstoCallback data = ...;
glfwSetWindowUserPointer(window, &data);
glfwSetCursorPosCallback(window, mouse_callback);

Debugging. Debugging OpenGL (and other graphics API) programs is painful: if you do one thing wrong, you'll likely get a black screen. The learnopengl tutorial provides useful tips for debugging. To debug shaders, it's particularly useful to use a debugger such as RenderDoc. Unfortunately, none of the existing OpenGL debuggers work on macOS anymore (Apple makes it extremely hard to develop OpenGL on macOS because they want people to use Metal). For macOS users, a potential debugging strategy is to emulate the shader on the CPU: write the same code on the CPU, print out the values, and see if it does what you expect. It's going to be painful regardless, I'm sorry. On the other hand, this is a fruitful research area that awaits innovation to make things better!

For the 3D transformation, copy your Homework 2 code to the parse_transformation function in hw3_scenes.cpp.

Implement the rest in hw_3_3 in hw3.cpp.

Test your OpenGL rendering using the following commands:

./balboa -hw 3_3 ../scenes/hw3/two_shapes.json

./balboa -hw 3_3 ../scenes/hw3/cube.json

./balboa -hw 3_3 ../scenes/hw3/spheres.json

./balboa -hw 3_3 ../scenes/hw3/teapot.json

./balboa -hw 3_3 ../scenes/hw3/bunny.json

./balboa -hw 3_3 ../scenes/hw3/buddha.json

For two_shapes and cube, you should get the same images as in the previous homework (before you move the camera yourself). The rest are new scenes. (teapot.json is a higher-resolution version that has 10 times more triangles!) Record a video of yourself moving the camera in each scene and save the videos as:

outputs/hw_3_3_two_shapes.mp4

outputs/hw_3_3_cube.mp4

outputs/hw_3_3_spheres.mp4

outputs/hw_3_3_teapot.mp4

outputs/hw_3_3_bunny.mp4

outputs/hw_3_3_buddha.mp4

Acknowledgement. The bunny model was scanned by Greg Turk and Marc Levoy back in 1994 at Stanford, so it is sometimes called the Stanford bunny. The texture of the bunny model was made by KickAir_8p, who posted the scene on blenderartists.org. The buddha texture was generated by Kun Zhou et al. for their TextureMontage paper.

Bonus: textures (15 pts). Read the Textures chapter of learnopengl.com and implement textures for the shapes above. We have provided UV maps for all models except two_shapes and cube. I have also included the original textures I used to produce the vertex colors for teapot, bunny, and buddha.


4 Lighting (25 pts)

For this part, read the chapters Colors and Basic Lighting in the tutorial, and implement some basic lighting in our viewer. Be careful about the transformation of the normals! Use the vertex colors or texture colors as the objectColor equivalent in the tutorial. Let's assume ambientStrength = 0.1, specularStrength = 0.5, and lightDir = normalize(vec3(1, 1, 1)). Note that you can extract the camera position by looking at the fourth column of cam_to_world.
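The per-pixel computation the fragment shader performs can be sketched on the CPU like this. Caveat: this sketch is a Blinn-Phong-style variant using a halfway vector and an assumed shininess exponent of 32; the tutorial's Basic Lighting chapter uses the reflect-based Phong form, and either is acceptable here.

```cpp
#include <algorithm>
#include <array>
#include <cmath>

using Vec3 = std::array<double, 3>;

double dot(const Vec3& a, const Vec3& b) {
    return a[0]*b[0] + a[1]*b[1] + a[2]*b[2];
}

Vec3 normalize(const Vec3& v) {
    double len = std::sqrt(dot(v, v));
    return { v[0] / len, v[1] / len, v[2] / len };
}

// Scalar lighting factor with ambientStrength = 0.1, specularStrength = 0.5,
// and the directional light normalize(vec3(1, 1, 1)) from the assignment.
// Multiply the result by the vertex/texture color to get the shaded color.
double lighting(const Vec3& normal, const Vec3& view_dir, double shininess = 32) {
    Vec3 n = normalize(normal);
    Vec3 l = normalize({1.0, 1.0, 1.0});                     // light direction
    Vec3 v = normalize(view_dir);
    Vec3 h = normalize({l[0] + v[0], l[1] + v[1], l[2] + v[2]}); // halfway vector
    double diffuse  = std::max(dot(n, l), 0.0);
    double specular = 0.5 * std::pow(std::max(dot(n, h), 0.0), shininess);
    return 0.1 + diffuse + specular;
}
```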

The way the tutorial does the lighting requires defining vertex normals (an alternative is to use face normals, but it often looks uglier). We have provided vertex normals for the following scenes:

./balboa -hw 3_4 ../scenes/hw3/spheres.json

./balboa -hw 3_4 ../scenes/hw3/teapot.json

./balboa -hw 3_4 ../scenes/hw3/bunny.json

./balboa -hw 3_4 ../scenes/hw3/buddha.json

Save your output as screenshots:

outputs/hw_3_4_spheres.png

outputs/hw_3_4_teapot.png

outputs/hw_3_4_bunny.png

outputs/hw_3_4_buddha.png

Bonus: lighting animation (10 pts). Add some animation to the light. Make it move the way you like, and submit a video recording of the animation.

Bonus: different types of lights (10 pts). Our current light is a directional light. Implement point lights and spot lights (see the Light Casters chapter) in your renderer, and support multiple lights.

Bonus: shadow mapping (20 pts). Implement a basic shadow map; see the Shadow Mapping chapter in learnopengl. Supporting directional lights is good enough.

5 Design your own scenes (10 pts)

We’re at the fun part again. Design your own scene and render it using your new renderer!

国产91精品精华液一区二区三区 | 蜜桃视频在线观看一区二区| 日韩电影在线免费| 日本伊人色综合网| 男女视频一区二区| 久久99精品久久久久久| 国产在线不卡一卡二卡三卡四卡| 天天综合日日夜夜精品| 欧美aaaaa成人免费观看视频| 亚洲电影在线播放| 日韩精品一级二级| 美女视频网站久久| 韩国女主播成人在线| 东方aⅴ免费观看久久av| heyzo一本久久综合| 色综合中文字幕| 欧美人与z0zoxxxx视频| 3751色影院一区二区三区| 日韩免费性生活视频播放| 久久综合五月天婷婷伊人| 国产精品久久久久一区| 亚洲精品欧美激情| 免费美女久久99| 国产一区二区按摩在线观看| 97国产精品videossex| 欧美性猛交xxxx乱大交退制版| 91精品欧美久久久久久动漫| 国产午夜精品一区二区三区视频| 成人免费一区二区三区在线观看| 亚洲二区视频在线| 国产一区二区久久| 色视频一区二区| 欧美成人性战久久| 亚洲精选视频免费看| 国产在线精品国自产拍免费| 99久久精品费精品国产一区二区|