I give up on the Three.js Blender exporter. How its rotations work is not documented. Trying ASSIMP.
The Three.js Blender exporter seems to hardcode a maximum of two bone influences, so each vertex is influenced by at most two bones. Also, after setting weights in Blender, choose "Normalise All" so the weights export properly.
The Three.js exporter script can be altered to increase the number of bone influences. Game Engine Architecture says four is adequate.
To increase the number of bone influences in the Three.js Blender exporter, find this line in the script (under the # skinning section) and raise the value:
MAX_INFLUENCES = 2
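The effect of that cap can be sketched as: keep the strongest N weights per vertex and renormalise them to sum to 1 (much like Blender's "Normalise All" after truncation). A minimal sketch, not the exporter's actual code:

```python
def limit_influences(weights, max_influences=4):
    """Keep the strongest bone influences for one vertex and renormalise.

    weights: list of (bone_index, weight) pairs for a single vertex.
    Returns the top max_influences pairs, rescaled to sum to 1.
    """
    top = sorted(weights, key=lambda w: w[1], reverse=True)[:max_influences]
    total = sum(w for _, w in top)
    if total == 0:
        return top  # degenerate vertex: nothing to renormalise
    return [(bone, w / total) for bone, w in top]

# A vertex weighted by five bones gets trimmed to the strongest four:
trimmed = limit_influences([(0, 0.5), (1, 0.2), (2, 0.1), (3, 0.1), (4, 0.1)])
```

The renormalisation step matters: simply dropping the weakest weights would leave the remaining weights summing to less than 1, which shows up as vertices collapsing toward the origin when skinned.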
It would appear that the Three.js Blender exporter exports indexed vertices. Wonderful.
- Three.js JSON Model format 3 looks like the best output format for the demo:
40 = 00101000
40: [vert, vert, vert], [uv, uv, uv], [norm, norm, norm]
Why are there three components per UV coordinate when UV is a 2D basis by definition? They're UVW coordinates, and the W is a depth value. Blender doesn't appear to export the W value, so it can be ignored.
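The leading number on each face is a bitmask describing what data follows. Decoding 40 explains the layout above; the bit names here are my reading of the format 3 spec, so treat them as an assumption:

```python
# Face-type bit flags from the Three.js JSON Model format 3 spec
# (names as I understand them; treat as an assumption).
FLAGS = [
    "isQuad",               # bit 0: quad (4 vertex indices) vs triangle (3)
    "hasMaterial",          # bit 1
    "hasFaceUv",            # bit 2: one UV index for the whole face
    "hasFaceVertexUv",      # bit 3: one UV index per face vertex
    "hasFaceNormal",        # bit 4: one normal index for the whole face
    "hasFaceVertexNormal",  # bit 5: one normal index per face vertex
    "hasFaceColor",         # bit 6
    "hasFaceVertexColor",   # bit 7
]

def decode_face_type(mask):
    """Return the names of the bits set in a face-type mask."""
    return [name for bit, name in enumerate(FLAGS) if mask & (1 << bit)]

print(decode_face_type(40))  # ['hasFaceVertexUv', 'hasFaceVertexNormal']
```

So 40 = 00101000 means a triangle with per-vertex UVs and per-vertex normals: three vertex indices, then three UV indices, then three normal indices, exactly as listed above.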
OpenGL can't handle separate index buffers for positions, UVs and normals (a single index addresses all vertex attributes at once), so a custom re-indexing algorithm must be made.
Could use Python to convert JSON into OpenGL-parsable data. It's much nicer to work with for text processing.
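The usual re-indexing approach: treat each (position, uv, normal) index triple as one OpenGL vertex, deduplicate the triples, and build a single element buffer over them. A sketch with hypothetical data, not tied to the actual JSON layout:

```python
def reindex(corners):
    """Collapse separate position/uv/normal indices into a single index.

    corners: list of (pos_idx, uv_idx, norm_idx) tuples, one per face corner.
    Returns (vertices, indices): each distinct triple becomes one OpenGL
    vertex slot, and the index buffer references those slots.
    """
    seen = {}       # (pos, uv, norm) -> new vertex index
    vertices = []   # unique triples, in first-seen order
    indices = []
    for corner in corners:
        if corner not in seen:
            seen[corner] = len(vertices)
            vertices.append(corner)
        indices.append(seen[corner])
    return vertices, indices

# Two triangles sharing an edge with identical attributes reuse vertex slots:
corners = [(0, 0, 0), (1, 1, 0), (2, 2, 0),
           (2, 2, 0), (1, 1, 0), (3, 3, 1)]
verts, idx = reindex(corners)
```

A corner only gets a new slot when any one of its three indices differs, so hard edges (same position, different normal) naturally duplicate the position, which is exactly what OpenGL needs.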
It would appear that the ResMan's (resource manager's) job is to pull needed resources (meshes, materials, ...) from the HDD for scene use, and:
- Associates an ID with every scene element and refcounts it. (could use a map for IDs)
- Prevents each SceneNode from owning its own mesh and materials.
- Can do batch loading.
- Can specify default materials for models.
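The ID-plus-refcount idea above can be sketched as a cache keyed by ID (all names here are hypothetical, not from the actual engine):

```python
class ResMan:
    """Minimal sketch of the resource manager described above: resources
    are cached by ID and reference-counted, so SceneNodes hold IDs rather
    than owning meshes/materials directly."""

    def __init__(self, loader):
        self._loader = loader   # callable that pulls a resource from disk
        self._cache = {}        # id -> (resource, refcount)

    def acquire(self, res_id):
        if res_id in self._cache:
            res, count = self._cache[res_id]
            self._cache[res_id] = (res, count + 1)
        else:
            res = self._loader(res_id)      # load from HDD on first use
            self._cache[res_id] = (res, 1)
        return res

    def release(self, res_id):
        res, count = self._cache[res_id]
        if count == 1:
            del self._cache[res_id]          # last user gone: unload
        else:
            self._cache[res_id] = (res, count - 1)

loads = []
rm = ResMan(lambda rid: loads.append(rid) or f"mesh:{rid}")
a = rm.acquire("knight")
b = rm.acquire("knight")   # second acquire hits the cache, no reload
rm.release("knight")
rm.release("knight")       # refcount hits zero, entry is evicted
```

Batch loading then becomes a loop of `acquire` calls before the scene starts, and default materials are just a fallback ID inside the loader.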
Materials can contain a uniform diffuse colour buffer for now, which will help control a number of objects at the same time. This control can move into the ModelNodes if per-object diffuse control (or other constant attributes) is needed.
To use ASSIMP or not? WebGL doesn't have ASSIMP, so maybe good practice to use lower-level parsers.
Can use JSON parser library for C++ and JS.
As part of "Real-Time Global Illumination in a Hybrid Distributed Environment" I looked into the possibility of running games and other interactive animated simulations with global illumination (i.e. physically based lighting generated using raytracing techniques). The paper compares two kinds of distributed computing system that could provide the computing power needed to display raytraced graphics at speeds suitable for gameplay: shared-memory distributed systems, such as general-purpose computing on graphics processing units (GPGPU), and distributed-memory systems, such as Beowulf clusters and cloud architectures.
I then determined a method in which the two types of distributed system might be combined to create a hybrid distributed system and created an MPICH C++ program to simulate packet distribution.
One interesting research project I covered in the literature review used an OpenGL-based game engine as a base, and determined a way to use OpenGL texture buffers to store raytraced information in order ...