Understanding how textures and shaders work in IFC
Recently, @yorgunkirmizi reached out to discuss how textures, shaders, texture coordinates, and all that type of stuff works in IFC. We've made some serious progress deciphering what is and isn't possible, and so I thought I'd open this thread so that others can also get involved or at least keep tabs on the progress.
As far as we can tell, textures are one of the better-known but unsupported aspects of IFC. The FZKViewer has some preliminary texture support, as in, for one of the types of textures, Jakob Beetz has managed to get it to show up on a primitive object. The BlenderBIM Add-on supports external references, which is a bit of a cheat to get arbitrarily complex textures, materials, and lighting, since it defers the texture definition to outside IFC. So we want to do better and support it properly, which also includes providing test cases for the rest of the industry so they can catch up too.
For contrast, let's start with how a representation is styled with a surface colour:
IfcShapeRepresentation --> IfcFacetedBrep <-- IfcStyledItem --> IfcSurfaceStyle --> IfcSurfaceStyleShading
The IfcSurfaceStyleShading includes basic viewport colours, a diffuse component, a specular component, and some transmission data. Think of it as your principled shader, with some flashbacks that still allow "phong" to be chosen as a shader type.
For textures, instead of an IfcSurfaceStyleShading, we use an IfcSurfaceStyleWithTextures. This then lets us stack a series of IfcSurfaceTexture textures. Think of this stack as the various nodes that combine to create your colour mixes, diffuse maps, specular maps, bump maps, and so on.
IfcShapeRepresentation --> IfcFacetedBrep <-- IfcStyledItem --> IfcSurfaceStyle --> IfcSurfaceStyleWithTextures --> IfcSurfaceTexture (L[1:?])
There are three types of IfcSurfaceTexture: blobs, images, and pixels. These are three different ways to reference your raw image data: an embedded blob, a URI pointing to an image, or an array of pixel RGBs. Regardless of which image reference you choose, all IfcSurfaceTextures share the same ability to combine in the stack with various effects and to reference texture coordinates.
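Since the pixel flavour is the least familiar of the three, here's a rough pure-Python sketch (no IFC library involved) of packing RGB pixels into per-pixel hex strings the way an IfcPixelTexture-style pixel list stores them. The helper name is made up, and the exact pixel ordering and ColourComponents handling should be checked against the IfcPixelTexture documentation rather than taken from this sketch.

```python
# Illustrative only: `to_pixel_list` is a hypothetical helper, not part
# of any IFC API. It flattens rows of (r, g, b) tuples (0-255) into hex
# strings, one per pixel, in row-major order, similar to how an
# IfcPixelTexture stores each pixel as a binary value.

def to_pixel_list(rgb_rows):
    """Flatten rows of (r, g, b) tuples into one hex string per pixel."""
    return [f"{r:02X}{g:02X}{b:02X}" for row in rgb_rows for (r, g, b) in row]

# A 2x2 checker: red, green / blue, white
rows = [
    [(255, 0, 0), (0, 255, 0)],
    [(0, 0, 255), (255, 255, 255)],
]
print(to_pixel_list(rows))  # ['FF0000', '00FF00', '0000FF', 'FFFFFF']
```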
In theory, there seems to be enough flexibility in the parameters to record any texture system, like Cycles or Eevee nodes, or to establish your own convention. However, IFC's texturing capabilities were derived from ISO/IEC 19775-1.2:2008, X3D Architecture and base components Edition 2, Part 1. So the default is to support how shaders work in X3D. In short: to understand textures in IFC, you first need to understand how shaders work in X3D.
There are some documents you want to read first:
- Read all the IFC docs on the classes mentioned above, especially IfcSurfaceTexture. You won't fully understand how IfcSurfaceTexture works, though, until you read the rest of the documents in this list.
- https://www.web3d.org/documents/specifications/19775-1/V3.2/ - the X3D spec. See clause 18, Texturing component, and read that entire section, with a focus on the MultiTexture node.
- https://castle-engine.io/x3d_multi_texturing.php after you think you understand how the MultiTexture node works, verify your knowledge of it by reading this page. It's a goldmine of knowledge.
- https://github.com/castle-engine/demo-models/tree/master/multi_texturing - this repository contains sample X3D code of test cases for each of the different shading combinations
- https://castle-engine.io/view3dscene.php - this demo viewer in theory lets you recreate the demo files. The results I got were different from the benchmark screenshot, so I've reached out for help from the author.
- https://github.com/blender/blender-addons/tree/blender-v2.79b-release/io_scene_x3d - thanks to @Gorgious for discovering this gem: Blender already has an X3D importer / exporter. From what I can tell it does not support MultiTexture, which is the key piece of the puzzle for understanding how textures work in IFC, but it does cover pixel textures and UV maps, code we can probably reuse.
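To make the MultiTexture reading above a bit more concrete, here's a rough pure-Python sketch of how a few of the X3D MultiTexture modes combine a texture stack channel by channel (MODULATE multiplies, ADD adds with clamping, REPLACE takes the new sample). This is my reading of clause 18 restricted to three of the many modes, not a reference implementation.

```python
# Rough sketch of X3D MultiTexture blending for a single RGB sample.
# Only three modes are shown; see X3D clause 18, Texturing component,
# for the full list. Channel values are floats in 0..1.

def blend(mode, prev, tex):
    """Combine the running colour `prev` with the next texture sample
    `tex`, channel by channel, according to one MultiTexture mode."""
    ops = {
        "MODULATE": lambda a, b: a * b,       # multiply (the default mode)
        "ADD": lambda a, b: min(a + b, 1.0),  # add, clamped to 1
        "REPLACE": lambda a, b: b,            # take the new sample
    }
    op = ops[mode]
    return tuple(op(a, b) for a, b in zip(prev, tex))

def apply_stack(base, stack):
    """Fold a list of (mode, rgb) texture layers over a base colour,
    in order, like the children of an X3D MultiTexture node."""
    colour = base
    for mode, tex in stack:
        colour = blend(mode, colour, tex)
    return colour

result = apply_stack(
    (1.0, 1.0, 1.0),
    [("MODULATE", (0.5, 0.5, 0.5)), ("ADD", (0.25, 0.0, 0.0))],
)
print(result)  # (0.75, 0.5, 0.5)
```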
@yorgunkirmizi and I have been working through this exercise:
1. Understanding the X3D specification and test cases.
2. Recreating the test cases in Blender, to understand how this seemingly dated standard is reflected in more modern rendering engines. We will create node trees for each of the base cases, describing what is and isn't possible.
3. Converting these test cases to IFC test cases, showing how you can recreate both the X3D definitions and the Blender material node trees in IFC.
4. Identifying where the existing IFC textures fall short, and whether any existing solutions in the X3D world may help.
5. Creating typical basic material setups that a designer might use, such as a basic diffuse map or all the maps from a PBR workflow, and showing how these are represented in IFC.
6. Clearly documenting where it falls short. Obvious scenarios that come to mind are procedural textures, blackbody colour temperatures, RGB curves, and SSS. Some of these can be supported fairly easily by simply describing new conventions (e.g. just add SSSMAP and done!) to bring it up to more recent standards. For others, we might fall back to external definitions, or define a render-engine-specific convention.
7. Write code! Make it happen! Textures! Yeah!
We've just finished step 2, and made a start on step 3.
Comments
We've just caught up to step 6. Before we begin step 7, we will have the benefit of consulting with Michalis Kamburelis, the guru from the open source X3D world: we will present our findings to him and ask his advice on whether we messed anything up.
It would take too long to document everything we've done so far, but here's a dump that might make some sense.
Here are the node trees for scenarios 4 and 5 (I still need to double-check some of these, but the concept is there):
So I think a number of us now have a pretty clear idea of what you can and can't do with texture node trees in IFC. Even better, we've now actually implemented the existing specification for export (up to a certain point, missing some of the more obscure features). To our knowledge, we're not only the first to implement this in IFC... but also among the few people who care about it in the X3D world. IFC seems to have chosen one of the most obscure and most buggily implemented aspects of the X3D spec to reference.
However, referencing X3D was a good move in general, I feel, because X3D is much more involved in the texturing world than IFC is, so wherever X3D goes, IFC can follow. Michalis, an expert in X3D, has confirmed our understanding and current implementation. More interestingly, however, he has also confirmed the various shortcomings of X3D, and its future direction. It seems that X3D v4 (IFC references v3.2) is now mirroring the approach of glTF, so the two will be compatible. This means that with a few tweaks, perhaps even without changing the current specification, we can make IFC4 compatible with X3D 4 and glTF. What a huge achievement that would be! Not only for interoperability, but for ensuring that the texturing systems are appropriate for modern artists and their tools.
It's difficult to summarise the full extent of what we've learned, but here's an attempt to put it into terms that most 3D artists would appreciate.
Here are some links Michalis pointed us to for the record:
I hope to put forward a series of proposals in time for IFC 4.3 that will align IFC, X3D 4, and glTF.
Thank you @Moult for the great conversation! I confirm everything in the above summary :)
Let me add a note summarizing the X3Dv4 approach to textures:
X3D v4 has 2 places where you can put textures:

1. You can put them in material fields, like Material.diffuseTexture, Material.specularTexture, PhysicalMaterial.baseTexture, PhysicalMaterial.metallicRoughnessTexture. See the X3D spec of the XxxMaterial nodes; they all have a number of xxxTexture fields: https://www.web3d.org/specifications/X3Dv4Draft/ISO-IEC19775-1v4-CD/Part01/components/shape.html#Material
2. You can put a texture (possibly a MultiTexture) in Appearance.texture. See https://www.web3d.org/specifications/X3Dv4Draft/ISO-IEC19775-1v4-CD/Part01/components/shape.html#Appearance

AD 1 is the more modern approach, matching at least glTF and Collada capabilities, and matching how authoring tools like Blender work. Each texture clearly affects a specific material parameter.

AD 2 is a backward-compatible approach. We don't plan to remove it in any foreseeable future... though we encourage using AD 1, at least for simple textures (when you don't need MultiTexture).

- It interacts with approach AD 1 in a clear way: if you use Appearance.texture, and Material.diffuseTexture is empty, then Appearance.texture acts exactly as Material.diffuseTexture. (There are similar rules for PhysicalMaterial.baseTexture and UnlitMaterial.emissiveTexture.)
- AD 2 is also the (only) gateway to continue using MultiTexture in X3Dv4. So you can compose a number of textures to calculate the diffuse parameter (for Phong) or the base parameter (for PBR).

Cheers @michalis! One thing I'd also like to investigate is whether or not we can align X3D, glTF, and IFC for texture UV coordinates. Is this something X3D and glTF have consensus on? Currently, IFC is actually missing texture UV coordinate support for non-triangulated meshes, so the time is ripe for some proposals whilst they're fixing that aspect.
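To make the AD 1 / AD 2 interaction Michalis describes concrete, here's a tiny Python sketch of the fallback rule, with plain dicts standing in for X3D nodes; none of these names are a real X3D browser API.

```python
# Sketch of the X3Dv4 fallback rule: if Material.diffuseTexture is
# empty, Appearance.texture acts exactly as the diffuse texture.
# Hypothetical data model, not any real X3D implementation.

def resolve_diffuse_texture(appearance):
    material = appearance.get("material", {})
    # AD 1 wins when present: the texture wired to the material field.
    if material.get("diffuseTexture") is not None:
        return material["diffuseTexture"]
    # AD 2 fallback: Appearance.texture acts as Material.diffuseTexture.
    return appearance.get("texture")

modern = {"material": {"diffuseTexture": "brick_diffuse.png"},
          "texture": "legacy.png"}
legacy = {"material": {}, "texture": "legacy.png"}

print(resolve_diffuse_texture(modern))  # brick_diffuse.png
print(resolve_diffuse_texture(legacy))  # legacy.png
```

(Per Michalis's note, similar rules apply to PhysicalMaterial.baseTexture and UnlitMaterial.emissiveTexture.)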
Further reading on the current IFC UV coordinate shortcomings: https://github.com/buildingSMART/IFC4.3.x-development/issues/135
I've split our findings into three separate proposals to buildingSMART. Read more here:
In the case of texture coordinates, X3D actually has a bit more than glTF.

glTF:

- glTF allows only 2D texture coordinates, as it only allows 2D textures (see https://www.khronos.org/registry/glTF/specs/2.0/glTF-2.0.html#meshes-overview - TEXCOORD_n is VEC2).
- For each texture, glTF lets you say whether to use the coordinates in TEXCOORD_0 or TEXCOORD_1 (implementations may support more, but the glTF spec only requires support for 2: "Client implementations SHOULD support at least two texture coordinate sets").

X3D:

X3D allows more.

- You can have many texture coordinate sets using MultiTextureCoordinate with children (you can specify as many as you like, though client implementations will certainly have some limits, as the underlying shaders have limits).
- Texture coordinates may be 2D, 3D, or 4D using TextureCoordinate, TextureCoordinate3D, TextureCoordinate4D; you can also auto-generate them using TextureCoordinateGenerator.
- Each texture (in a material) has a field that says which one to use, like Material.diffuseTextureMapping or PhysicalMaterial.baseTextureMapping, and it should match the mapping field in one of the texture coordinates. If you use MultiTexture in Appearance.texture then it's just backward compatible, i.e. textures in MultiTexture take coordinates from MultiTextureCoordinate in the same order.
- X3D can trivially express the glTF case: just use the mapping names TEXCOORD_0, TEXCOORD_1 in fields like Material.diffuseTextureMapping, PhysicalMaterial.baseTextureMapping, TextureCoordinate.mapping. This allows expressing glTF in a mostly straightforward way (this is what Castle Game Engine / view3dscene also does under the hood when you open glTF).

@michalis thanks! I've got quite a bit to learn. In X3D, we talked about how COORD maps to "Generated" texture coordinates with this definition:
However, does COORD have any effect on the projection method in the Image Texture node? (See link below.) Does choosing "Flat", "Box", "Sphere" or "Tube" have any impact? Is there an equivalent attribute for this projection in X3D? Being able to do Box projection for textures on extruded geometry in IFC is very important, so I'd like to confirm the support for it :)
X3D TextureCoordinateGenerator.mode="COORD" means that we provide 3D texture coordinates based on the mesh vertexes' 3D positions. This is like Blender "Flat" mapping, but in 3D (this only matters if you use 3D textures; for 2D textures, it just means that the XY position determines UV).

The default texture mapping on an X3D IndexedFaceSet chooses the 2 largest axes and maps them to the U and V texture coordinates. In CGE you can also request it explicitly on any geometry, by TextureCoordinateGenerator.mode="BOUNDS2D". This does not correspond to any Blender mode immediately (though you could perhaps construct it using Blender nodes, as it's like Blender "Flat" -- but with flexibility about which 3D vertex axis corresponds to which 2D texture coordinate axis; I'm sure it's possible to configure Blender nodes to achieve this).

The effects of TextureCoordinateGenerator and the above-mentioned "default texture mapping on IndexedFaceSet" are recalculated every frame. (Though I recall we had some confusion about this on the X3D mailing list a few years ago, and I'm not 100% sure whether it's true for all X3D browsers. But it should be true for Castle Game Engine / view3dscene and X3DOM.) So no X3D automatic texture coordinate mapping achieves the effect of "keeping them sticking to the surface under animation"; X3D doesn't specify to "calculate them once, and then keep them sticky once the mesh deforms". The way to implement "sticky" texture coordinates is to just generate explicit UVs in Blender and export them to explicit X3D TextureCoordinate values. These are "sticky".

X3D also doesn't have any automatic generation resembling Blender "Box", "Sphere" or "Tube". Such values would also have to be set as UV coordinates and exported to explicit X3D TextureCoordinate values.
These are some notes "around" the questions you asked :) I am not sure whether I answered everything properly; please ask more if anything is unclear above :)
related: https://community.osarch.org/discussion/1365/blenderbim-storing-reference-drawing-with-the-ifc#latest
Interesting, are there any IFC viewers capable of displaying this? I like pictures more
Not that I know of.