We have covered quite a lot by now; of the basics, only textures are left. Textures are a very important part of DirectX. For simplicity, we will again use Tutorial 5 from the SDK as our example.
A texture works like wallpaper applied to the surface of an object. If it is large enough, a single application can cover the whole surface; with the right settings you can also tile the texture to get the arrangement you want.
Let's look at the more important texture-related methods (a short combined usage sketch follows the three signatures). First, Device.SetTexture:
public void SetTexture(
    int stage,            // index of the texture blending stage, starting at 0
    BaseTexture texture   // the texture object to set
);

public void SetTextureStageState(
    int stage,                  // index of the texture blending stage
    TextureStageStates state,   // a member of the TextureStageStates enumeration
    int value                   // the value for that stage state
);

SetTextureStageState is used for texture-coordinate processing, color operations, alpha operations, and bump/environment mapping. Note that these settings only apply to the DirectX 9 fixed-function multitexturing unit; they cannot be combined with pixel shaders.

public void SetSamplerState(
    int stage,                  // index of the sampler stage
    SamplerStageStates state,   // a member of the SamplerStageStates enumeration
    int value                   // the value for that sampler state
);
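
Here is how the three calls are typically combined right before drawing. This is only a minimal sketch, assuming device is an already created Device and tex an already loaded Texture (both names are just placeholders); Managed DirectX also exposes the same settings through the TextureState and SamplerState indexer properties, and the tutorial code further down uses the TextureState form.

// Minimal usage sketch (assumed names: device, tex).
device.SetTexture(0, tex);                                // bind tex to texture stage 0
device.SetTextureStageState(0, TextureStageStates.ColorOperation,
    (int)TextureOperation.Modulate);                      // stage 0 output = texture color * diffuse color
device.SetTextureStageState(0, TextureStageStates.ColorArgument1,
    (int)TextureArgument.TextureColor);
device.SetTextureStageState(0, TextureStageStates.ColorArgument2,
    (int)TextureArgument.Diffuse);
device.SetSamplerState(0, SamplerStageStates.MinFilter,
    (int)TextureFilter.Linear);                           // bilinear filtering when the texture is minified
device.SetSamplerState(0, SamplerStageStates.MagFilter,
    (int)TextureFilter.Linear);                           // and when it is magnified

// Equivalent property form used in the code later in this article:
// device.TextureState[0].ColorOperation = TextureOperation.Modulate;
// device.SamplerState[0].MinFilter = TextureFilter.Linear;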
With these methods in mind, the code below is easy to follow. We need to set up the vertices, and here there is one small change: none of the vertices we used before involved textures, so we picked CustomVertex types without texture coordinates. This time we use CustomVertex.PositionNormalTextured which, as the name suggests, contains a normal in addition to the position (x, y, z) and the texture coordinates tu and tv.
CustomVertex.PositionTextured would of course work as well; it simply leaves out the normal.
Next we need to fill in the data for each vertex, but first a short digression on texture coordinates. To let us address every texel in a texture, DirectX uses a generalized addressing scheme: a texture address consists of coordinates in the range [0.0, 1.0], so we never have to care about the actual pixel dimensions of the texture. For example, the coordinates (0.0f, 0.0f), (1.0f, 0.0f), (1.0f, 1.0f), (0.0f, 1.0f) map the whole texture onto a rectangle, while (0.0f, 0.0f), (0.5f, 0.0f), (0.5f, 1.0f), (0.0f, 1.0f) map only the left half of the texture.
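To make that concrete, here is a small sketch, assuming the simpler CustomVertex.PositionTextured type (the positions are purely illustrative), that lays the full texture across a quad arranged as a triangle strip:

// A textured quad in triangle-strip order, assuming CustomVertex.PositionTextured.
CustomVertex.PositionTextured[] quad = new CustomVertex.PositionTextured[4];
quad[0] = new CustomVertex.PositionTextured(-1.0f,  1.0f, 0.0f, 0.0f, 0.0f); // top-left,     (tu, tv) = (0, 0)
quad[1] = new CustomVertex.PositionTextured( 1.0f,  1.0f, 0.0f, 1.0f, 0.0f); // top-right,    (1, 0)
quad[2] = new CustomVertex.PositionTextured(-1.0f, -1.0f, 0.0f, 0.0f, 1.0f); // bottom-left,  (0, 1)
quad[3] = new CustomVertex.PositionTextured( 1.0f, -1.0f, 0.0f, 1.0f, 1.0f); // bottom-right, (1, 1)
// Changing tu on quad[1] and quad[3] from 1.0f to 0.5f would show only the left half of the texture.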
We can load an image file as a texture with the TextureLoader.FromFile method.
The full code is straightforward and carries detailed comments, so I will not walk through it line by line:
//-----------------------------------------------------------------------------
// File: Texture.cs
//
// Desc: Better than just lights and materials, 3D objects look much more
//       convincing when texture-mapped. Textures can be thought of as a sort
//       of wallpaper that is shrink-wrapped to fit the geometry. Textures are
//       typically loaded from image files, and D3DX provides a utility
//       function to do this for us. Like a vertex buffer, textures have
//       Lock() and Unlock() functions to access (read or write) the image
//       data. Textures have a width, height, mip level, and pixel format. The
//       mip level is for "mipmapped" textures, an advanced performance-
//       enhancing feature which uses lower resolutions of the texture for
//       objects in the distance where detail is less noticeable. The pixel
//       format determines how the colors are stored in a texel. The most
//       common formats are the 16-bit R5G6B5 format (5 bits of red, 6 bits of
//       green and 5 bits of blue) and the 32-bit A8R8G8B8 format (8 bits each
//       of alpha, red, green, and blue).
//
//       Textures are associated with geometry through texture coordinates.
//       Each vertex has one or more sets of texture coordinates, which are
//       named tu and tv and range from 0.0 to 1.0. Texture coordinates can be
//       supplied by the geometry, or can be automatically generated using
//       Direct3D texture coordinate generation (which is an advanced feature).
//
// Copyright (c) Microsoft Corporation. All rights reserved.
//-----------------------------------------------------------------------------
using System;
using System.Drawing;
using System.Windows.Forms;
using Microsoft.DirectX;
using Microsoft.DirectX.Direct3D;
using Direct3D = Microsoft.DirectX.Direct3D;

namespace TextureTutorial
{
    public class Textures : Form
    {
        // Our global variables for this project
        Device device = null; // Our rendering device
        VertexBuffer vertexBuffer = null;
        Texture texture = null;
        PresentParameters presentParams = new PresentParameters();
        bool pause = false;

        public Textures()
        {
            // Set the initial size of our form
            this.ClientSize = new System.Drawing.Size(400, 300);
            // And its caption
            this.Text = "Direct3D Tutorial 5 - Textures";
        }

        public bool InitializeGraphics()
        {
            try
            {
                presentParams.Windowed = true; // We don't want to run fullscreen
                presentParams.SwapEffect = SwapEffect.Discard; // Discard the frames
                presentParams.EnableAutoDepthStencil = true; // Turn on a depth stencil
                presentParams.AutoDepthStencilFormat = DepthFormat.D16; // And the stencil format
                device = new Device(0, DeviceType.Hardware, this, CreateFlags.SoftwareVertexProcessing, presentParams); // Create a device
                device.DeviceReset += new System.EventHandler(this.OnResetDevice);
                this.OnCreateDevice(device, null);
                this.OnResetDevice(device, null);
                pause = false;
                return true;
            }
            catch (DirectXException)
            {
                // Catch any errors and return a failure
                return false;
            }
        }

        public void OnCreateDevice(object sender, EventArgs e)
        {
            Device dev = (Device)sender;
            // Now create the VB
            vertexBuffer = new VertexBuffer(typeof(CustomVertex.PositionNormalTextured), 100, dev, Usage.WriteOnly, CustomVertex.PositionNormalTextured.Format, Pool.Default);
            vertexBuffer.Created += new System.EventHandler(this.OnCreateVertexBuffer);
            this.OnCreateVertexBuffer(vertexBuffer, null);
        }

        public void OnResetDevice(object sender, EventArgs e)
        {
            Device dev = (Device)sender;
            // Turn off culling, so we see the front and back of the triangle
            dev.RenderState.CullMode = Cull.None;
            // Turn off D3D lighting
            dev.RenderState.Lighting = false;
            // Turn on the Z-buffer
            dev.RenderState.ZBufferEnable = true;
            // Now create our texture
            texture = TextureLoader.FromFile(dev, Application.StartupPath + @"/../../banana.bmp");
        }
        public void OnCreateVertexBuffer(object sender, EventArgs e)
        {
            VertexBuffer vb = (VertexBuffer)sender;
            // Fill the vertex buffer (100 CustomVertex entries)
            CustomVertex.PositionNormalTextured[] verts = (CustomVertex.PositionNormalTextured[])vb.Lock(0, 0); // Lock the buffer (which will return our structs)
            for (int i = 0; i < 50; i++)
            {
                // Fill up our structs
                float theta = (float)(2 * Math.PI * i) / 49;
                verts[2 * i].Position = new Vector3((float)Math.Sin(theta), -1, (float)Math.Cos(theta));
                verts[2 * i].Normal = new Vector3((float)Math.Sin(theta), 0, (float)Math.Cos(theta));
                verts[2 * i].Tu = ((float)i) / (50 - 1);
                verts[2 * i].Tv = 1.0f;
                verts[2 * i + 1].Position = new Vector3((float)Math.Sin(theta), 1, (float)Math.Cos(theta));
                verts[2 * i + 1].Normal = new Vector3((float)Math.Sin(theta), 0, (float)Math.Cos(theta));
                verts[2 * i + 1].Tu = ((float)i) / (50 - 1);
                verts[2 * i + 1].Tv = 0.0f;
            }
            // Unlock (and copy) the data
            vb.Unlock();
        }

        private void SetupMatrices()
        {
            // For our world matrix, we will just rotate the object about the y-axis.
            device.Transform.World = Matrix.RotationAxis(new Vector3((float)Math.Cos(Environment.TickCount / 250.0f), 1, (float)Math.Sin(Environment.TickCount / 250.0f)), Environment.TickCount / 1000.0f);
            // Set up our view matrix. A view matrix can be defined given an eye point,
            // a point to look at, and a direction for which way is up. Here, we set the
            // eye five units back along the z-axis and up three units, look at the
            // origin, and define "up" to be in the y-direction.
            device.Transform.View = Matrix.LookAtLH(new Vector3(0.0f, 3.0f, -5.0f), new Vector3(0.0f, 0.0f, 0.0f), new Vector3(0.0f, 1.0f, 0.0f));
            // For the projection matrix, we set up a perspective transform (which
            // transforms geometry from 3D view space to 2D viewport space, with
            // a perspective divide making objects smaller in the distance). To build
            // a perspective transform, we need the field of view (1/4 pi is common),
            // the aspect ratio, and the near and far clipping planes (which define at
            // what distances geometry should no longer be rendered).
            device.Transform.Projection = Matrix.PerspectiveFovLH((float)Math.PI / 4.0f, 1.0f, 1.0f, 100.0f);
        }
        private void Render()
        {
            if (pause)
                return;

            // Clear the back buffer to a blue color
            device.Clear(ClearFlags.Target | ClearFlags.ZBuffer, System.Drawing.Color.Blue, 1.0f, 0);
            // Begin the scene
            device.BeginScene();
            // Setup the world, view, and projection matrices
            SetupMatrices();
            // Setup our texture. Using textures introduces the texture stage states,
            // which govern how textures get blended together (in the case of multiple
            // textures) and lighting information. In this case, we are modulating
            // (blending) our texture with the diffuse color of the vertices.
            device.SetTexture(0, texture);
            device.TextureState[0].ColorOperation = TextureOperation.Modulate;
            device.TextureState[0].ColorArgument1 = TextureArgument.TextureColor;
            device.TextureState[0].ColorArgument2 = TextureArgument.Diffuse;
            device.TextureState[0].AlphaOperation = TextureOperation.Disable;

            device.SetStreamSource(0, vertexBuffer, 0);
            device.VertexFormat = CustomVertex.PositionNormalTextured.Format;
            device.DrawPrimitives(PrimitiveType.TriangleStrip, 0, (4 * 25) - 2);
            // End the scene
            device.EndScene();
            // Update the screen
            device.Present();
        }

        protected override void OnPaint(System.Windows.Forms.PaintEventArgs e)
        {
            this.Render(); // Render on painting
        }

        protected override void OnKeyPress(System.Windows.Forms.KeyPressEventArgs e)
        {
            if ((int)(byte)e.KeyChar == (int)System.Windows.Forms.Keys.Escape)
                this.Dispose(); // Esc was pressed
        }

        protected override void OnResize(System.EventArgs e)
        {
            pause = ((this.WindowState == FormWindowState.Minimized) || !this.Visible);
        }

        /// <summary>
        /// The main entry point for the application.
        /// </summary>
        static void Main()
        {
            using (Textures frm = new Textures())
            {
                if (!frm.InitializeGraphics()) // Initialize Direct3D
                {
                    MessageBox.Show("Could not initialize Direct3D. This tutorial will exit.");
                    return;
                }
                frm.Show();

                // While the form is still valid, render and process messages
                while (frm.Created)
                {
                    frm.Render();
                    Application.DoEvents();
                }
            }
        }
    }
}
There is also a simpler way to create a texture. It amounts to much the same thing, it just looks a bit shorter:

tex = new Texture(device, new Bitmap(this.GetType(), "puck.bmp"), Usage.Dynamic, Pool.Default);

Then, at render time, a single call

device.SetTexture(0, tex);

is enough to apply the texture to the object. However, as soon as you need slightly more complex texture operations, this shortcut is no longer sufficient.
There is still a great deal more to textures, such as texture addressing modes, texture wrapping, texture filtering and anti-aliasing, alpha blending, multitexturing, and so on. What is covered here is only a tiny fraction, but all of these topics will be introduced step by step later on.
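As a small preview, and only as a sketch assuming the device created above, the sampler states we already met can tile and smooth the texture like this:

// Preview sketch (assumes the existing device): wrap addressing plus bilinear filtering.
device.SamplerState[0].AddressU = TextureAddress.Wrap;   // tu values outside [0, 1] repeat the texture
device.SamplerState[0].AddressV = TextureAddress.Wrap;   // same for tv
device.SamplerState[0].MinFilter = TextureFilter.Linear; // smooth the texture when it is shrunk
device.SamplerState[0].MagFilter = TextureFilter.Linear; // and when it is enlarged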
by sssa2000