pouët.net

Need some tips for 4k demomaking

category: general [glöplog]
4ks using shaders:
http://www.pouet.net/prod.php?which=25850
http://www.pouet.net/prod.php?which=13089
1ks
http://www.pouet.net/prod.php?which=26139
http://www.pouet.net/prod.php?which=18817

So I think 4k and even 1k are enough space for shaders.



added on the 2006-09-05 08:56:48 by auld auld
I tried it all, we usually code our 4ks 100% in assembler (with our own - t0a).

If you want shaders, use D3D and D3DX - the init code is smaller (the OGL shader init is overkill) + D3DX gives you some nice things (e.g. mesh containers, PRT, matrix & math funcs (3D SPLINES!:)).

If you don't need shaders use OpenGL.

And yeah.... coding with COM interfaces (D3D) in assembler sucks ;)
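
The D3D9 device really is just a handful of calls - rough sketch in C (names are the standard d3d9.h C macros, window creation and error checks left out, and in a real 4k this would of course be asm):

#include <windows.h>
#include <d3d9.h>

/* sketch only: a D3D9 device in a handful of calls */
static IDirect3DDevice9 *dev;

void init_d3d(HWND hwnd)
{
    IDirect3D9 *d3d = Direct3DCreate9(D3D_SDK_VERSION);
    D3DPRESENT_PARAMETERS pp = {0};
    pp.Windowed = TRUE;                     /* windowed keeps the struct minimal */
    pp.SwapEffect = D3DSWAPEFFECT_DISCARD;
    pp.hDeviceWindow = hwnd;
    IDirect3D9_CreateDevice(d3d, D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hwnd,
                            D3DCREATE_HARDWARE_VERTEXPROCESSING, &pp, &dev);
}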

and yeah... take a look here:
http://bitfellas.org/e107_plugins/forum/forum_viewtopic.php?5526.last :D

greets las/null ok^metalvotze
added on the 2006-09-05 11:55:06 by las las
and yeah. ;)
added on the 2006-09-05 11:55:35 by las las
auld, fragment/vertex programs are still pretty expensive since OpenGL doesn't support offline shader compiling (yet) and you have to store the shaders in ASCII. Still, I don't know if you have some kind of trick for that ...
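
For reference, this is roughly what it looks like on the GL side - the GLSL text sits in the exe as-is and goes to the driver's compiler at runtime (sketch with the GL 2.0 entry points, which on Windows you also have to fetch with wglGetProcAddress, so they cost imports too):

#define GL_GLEXT_PROTOTYPES
#include <GL/gl.h>
#include <GL/glext.h>   /* on Windows: wglGetProcAddress these instead */

/* the shader ships as plain ASCII inside the binary */
static const char *fs =
    "void main(){"
    "gl_FragColor=vec4(gl_FragCoord.xy/vec2(640.,480.),0.,1.);"
    "}";

GLuint make_program(void)
{
    GLuint s = glCreateShader(GL_FRAGMENT_SHADER);
    GLuint p;
    glShaderSource(s, 1, &fs, 0);   /* hand the text to the driver... */
    glCompileShader(s);             /* ...which compiles it at runtime */
    p = glCreateProgram();
    glAttachShader(p, s);
    glLinkProgram(p);
    return p;
}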
added on the 2006-09-05 14:57:59 by kanttu kanttu
Shaders are for loosers. Real coders use kilts (man skirts).
added on the 2006-09-05 15:35:55 by xernobyl xernobyl
I don't have tricks for OGL shaders yet (well, only obvious stuff like packing attribute data into color commands to avoid importing attribute calls). Nonetheless sek managed a 1k in C with a vertex and fragment shader. Even without import by ordinal, I saw a version which fits in 1k but loses the noise.
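
The color trick is basically just this (rough sketch) - glColor3f is a plain GL 1.1 import you have anyway, so per-vertex data can go through it and the vertex shader reads it back as gl_Color, no glVertexAttrib/glGetAttribLocation imports needed:

#include <GL/gl.h>

/* sketch: abuse the fixed-function color as a free vertex attribute.
   the vertex shader picks it up as gl_Color, e.g.:
       varying vec3 d;
       void main(){ d = gl_Color.rgb; gl_Position = gl_Vertex; }      */
void draw_fullscreen_quad(float t)
{
    glBegin(GL_QUADS);
    glColor3f(t, 0.0f, 0.0f); glVertex2f(-1.0f, -1.0f);
    glColor3f(t, 1.0f, 0.0f); glVertex2f( 1.0f, -1.0f);
    glColor3f(t, 1.0f, 1.0f); glVertex2f( 1.0f,  1.0f);
    glColor3f(t, 0.0f, 1.0f); glVertex2f(-1.0f,  1.0f);
    glEnd();
}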

I'd say las might be right about dx though - it might be a better way forward *but* OGL is very possible. Two of the prods above (a little shady is in OGL too) prove it.


added on the 2006-09-05 17:08:19 by auld auld
Quote:

Most examples I found are done with plain text which is parsed at runtime...
it's not possible with OGL to launch a utility that will parse my PS script and give me a binary I can include in my exe file afterwards?
...
OpenGL doesn't support offline shader compiling (yet) and you have to store the shaders in ASCII.


To clear something up: the GL approach to shaders is to keep the compiler in the OpenGL implementation, not the SDK. This is so that the shader programs can be optimised for -the particular implementation- that they happen to be running on, whatever its hardware peculiarities are. It's tougher to optimise when the thing is compiled already, since you've lost a lot of information about what the developer wanted to do.
It's highly unlikely that you'll ever pre-compile your shaders for use in OpenGL.

This is good from the perspective of your demo working on opengl-compatible hardware 10 years from now, but bad from the perspective of saving a few bytes in a 4k.
added on the 2006-09-05 21:43:57 by GbND GbND
There will be support for offline compiling in OpenGL 2.1

http://www.gamedev.net/columns/events/gdc2006/article.asp?id=233
added on the 2006-09-06 00:03:57 by kanttu kanttu
kanttu: yes, but binary shaders in GLES are not compatible across implementations (and in most cases rendering cores), so it's more or less useless for 4k. Unless you want a one-GPU-only intro.
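
(for the curious, the ES2 binary path is roughly this - the format token and the blob come out of one vendor's offline compiler, which is exactly why the same binary won't load anywhere else; names here are just placeholders)

#include <GLES2/gl2.h>

/* sketch: blob + vendor_format come from one vendor's offline compiler,
   so this only works on that vendor's implementation */
GLuint load_binary_fs(const void *blob, GLsizei len, GLenum vendor_format)
{
    GLuint s = glCreateShader(GL_FRAGMENT_SHADER);
    glShaderBinary(1, &s, vendor_format, blob, len);  /* no source text needed */
    return s;
}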
added on the 2006-09-06 12:58:04 by kusma kusma
