Understanding how textures and shaders work in IFC

edited February 2022 in General

Recently, @yorgunkirmizi reached out to discuss how textures, shaders, texture coordinates, and all that type of stuff works in IFC. We've made some serious progress deciphering what is and isn't possible, and so I thought I'd open this thread so that others can also get involved or at least keep tabs on the progress.

As far as we can tell, textures are one of the better-known but largely unsupported aspects of IFC. The FZKViewer has some preliminary texture support: for one of the texture types, Jakob Beetz has managed to get it to show up on a primitive object. The BlenderBIM Add-on supports external references, which is a bit of a cheat to get arbitrarily complex textures, materials, and lighting, since it defers the texture definition outside IFC. We want to do better and support it properly, which also includes providing test cases for the rest of the industry so they can catch up too.

For contrast, let's start with how a representation is styled with a surface colour:

IfcShapeRepresentation --> IfcFacetedBrep <-- IfcStyledItem --> IfcSurfaceStyle --> IfcSurfaceStyleShading

The IfcSurfaceStyleShading includes basic viewport colours, a diffuse component, a specular component, and some transmission data. Think of it as your principled shader, with some throwbacks that still allow "phong" to be chosen as a shader type.
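To ground the terminology, here is a rough Python sketch of the kind of evaluation this shading data feeds. The attribute names and the maths are our simplification of a classic Phong term, not the exact IFC schema:

```python
# Hedged sketch: a minimal Phong-style evaluation of the kind of data
# an IfcSurfaceStyleShading / IfcSurfaceStyleRendering carries.
# Attribute names are illustrative stand-ins, not the exact IFC schema.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def phong(surface_colour, specular_colour, shininess, n, l, v):
    """Tiny Phong term: diffuse + specular only, no ambient or transmission."""
    diff = max(dot(n, l), 0.0)
    # Reflect the light vector about the normal: r = 2(n.l)n - l
    r = tuple(2.0 * dot(n, l) * ni - li for ni, li in zip(n, l))
    spec = max(dot(r, v), 0.0) ** shininess
    return tuple(
        min(1.0, c * diff + s * spec)
        for c, s in zip(surface_colour, specular_colour)
    )

# Light, normal and view all aligned: full diffuse plus full specular.
print(phong((0.8, 0.2, 0.2), (0.1, 0.1, 0.1), 32, (0, 0, 1), (0, 0, 1), (0, 0, 1)))
```

With the vectors aligned, both terms reach full strength; with the light grazing at 90 degrees, everything falls to black.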

For textures, instead of using an IfcSurfaceStyleShading, we use IfcSurfaceStyleWithTextures. This lets us stack a series of IfcSurfaceTextures. Think of this stack as the various nodes that combine to create your colour mixes, diffuse maps, specular maps, bump maps, and so on.

IfcShapeRepresentation --> IfcFacetedBrep <-- IfcStyledItem --> IfcSurfaceStyle --> IfcSurfaceStyleWithTextures --> IfcSurfaceTexture (L[1:?])

There are three types of IfcSurfaceTexture: Blobs, Images, and Pixels. These are three different ways to reference your raw image data: an embedded blob, a URI pointing to an image, or an array of pixel RGBs. Regardless of which image reference you choose, all IfcSurfaceTextures share the same ability to combine in the stack with various effects and to reference texture coordinates.

In theory, there seems to be enough flexibility in the parameters to record any texture system, like Cycles or Eevee nodes, or to establish your own convention. However, IFC's texturing capabilities were derived from ISO/IEC 19775-1.2:2008 X3D Architecture and base components Edition 2, Part 1. So the default is to support how shaders work in X3D. In short: to understand textures in IFC, you first need to understand how shaders work in X3D.
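As a taste of what those X3D semantics look like, here is a hedged Python sketch (the names and simplifications are ours) of how a stack of textures with per-texture blending modes folds into a single colour:

```python
# Hedged sketch of X3D MultiTexture-style stack evaluation.
# Each entry is (mode, rgb) where rgb is the sampled texel; the stack
# folds left to right. Simplified: per-channel maths only, no alpha
# handling, results clamped to [0, 1].

def clamp(c):
    return tuple(min(1.0, max(0.0, x)) for x in c)

def apply_mode(mode, acc, tex):
    if mode == "REPLACE":
        return tex
    if mode == "MODULATE":  # per-channel multiply
        return tuple(a * t for a, t in zip(acc, tex))
    if mode == "ADD":
        return clamp(tuple(a + t for a, t in zip(acc, tex)))
    if mode == "SUBTRACT":
        return clamp(tuple(a - t for a, t in zip(acc, tex)))
    raise ValueError(f"unhandled mode: {mode}")

def evaluate_stack(stack):
    acc = (1.0, 1.0, 1.0)  # neutral base: MODULATE by the first texture keeps it as-is
    for mode, tex in stack:
        acc = apply_mode(mode, acc, tex)
    return acc

# MODULATE then ADD, in the spirit of the modes_and_sources test cases:
print(evaluate_stack([("MODULATE", (0.5, 0.5, 0.5)), ("ADD", (0.25, 0.5, 0.75))]))
```

Note the ADD result saturating at 1.0 in the blue channel; clamping behaviour is one of the details implementations get wrong.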

There are some documents you want to read first:

@yorgunkirmizi and I have been working through this exercise:

  1. Understanding the X3D specification and test cases
  2. Recreating the test cases in Blender, to understand how this seemingly dated standard is reflected in more modern rendering engines. We will create node trees for each of the base cases, describing what is and isn't possible.
  3. Converting these test cases to IFC test cases, showing how you can recreate both the X3D definitions and the Blender material node trees in IFC.
  4. Identifying where the existing IFC textures fall short, and whether or not any existing solutions in the X3D world may help.
  5. Creating typical basic material setups that a designer might use, such as a basic diffuse map, all the maps from a PBR workflow, and showing how these are represented in IFC.
  6. Clearly documenting where it falls short. Obvious scenarios that come to mind are procedural textures, blackbody colour temperatures, RGB curves, and SSS. Some of these can be supported fairly easily by simply describing new conventions (e.g. just add SSSMAP and done!) to bring it up to more recent standards. For others, we might fall back to external definitions, or define a render engine specific convention.
  7. Write code! Make it happen! Textures! Yeah!

We're just finishing up step 2, and have made a start on step 3.



  • We've just caught up to step 6. Before we begin step 7, we will have the benefit of consulting with Michalis Kamburelis, a guru from the open source X3D world; we will present our findings and ask his advice on whether we've messed anything up.

    It takes too long to document what we've done so far, but here's a dump that might make some sense.

    # Recreating the modes_and_sources test cases, row 1, col 1
    #2=IfcImageTexture($,$,'MODULATE',$,('','','1 1 1','1'),'data/pattern.png')
    #3=IfcImageTexture($,$,'MODULATE',$,('','','1 1 1','1'),'data/squirrel.png')
    # Recreating the modes_and_sources test cases, row 1, col 2
    #2=IfcImageTexture($,$,'MODULATE',$,('','','1 1 1','1'),'data/pattern.png')
    #3=IfcImageTexture($,$,'ADD',$,('','','1 1 1','1'),'data/squirrel.png')
    # Recreating the modes_and_sources test cases, row 1, col 3
    #2=IfcImageTexture($,$,'MODULATE',$,('','','1 1 1','1'),'data/pattern.png')
    #3=IfcImageTexture($,$,'REPLACE',$,('','','1 1 1','1'),'data/squirrel.png')
    # Current X3D behaviour
    # In the default scenario of two opaque textures, the subtract mode applies to RGBA
    # The alpha channel then becomes 0: an invisible texture
    # This is one of the most common mistakes in X3D implementations
    # Recreating the modes_and_sources test cases, row 1, col 4
    #2=IfcImageTexture($,$,'MODULATE',$,('','','1 1 1','1'),'data/pattern.png')
    #3=IfcImageTexture($,$,'SUBTRACT',$,('','','1 1 1','1'),'data/squirrel.png')
    # This is highly unintuitive to artists and rendering engines
    # Artists typically expect the blending mode to only apply to RGB channels
    # Option 1:
    # By default, a single mode such as "SUBTRACT" only applies to the RGB.
    # Implicitly, the mode of "MODULATE" is always applied to the A channel.
    # This is the default in X3D, however note that it gives highly unintuitive results
    # I find this equally confusing to artists as blending RGBA (current X3D behaviour)
    #2=IfcImageTexture($,$,'MODULATE',$,('','','1 1 1','1'),'data/pattern.png')
    #3=IfcImageTexture($,$,'SUBTRACT',$,('','','1 1 1','1'),'data/squirrel.png')
    # Option 2:
    # By default, a single mode such as "SUBTRACT" only applies to the RGB.
    # Implicitly, the "REPLACE" mode is always applied to the A channel of the first texture
    # Subsequent A channel modes default to using the "ADD" mode
    #2=IfcImageTexture($,$,'MODULATE',$,('','','1 1 1','1'),'data/pattern.png')
    #3=IfcImageTexture($,$,'SUBTRACT',$,('','','1 1 1','1'),'data/squirrel.png')
    # Option 3:
    # By default, a single mode such as "SUBTRACT" applies to RGBA.
    # Always require two modes as per the X3D spec to explicitly distinguish between RGB and A
    # The two modes must be separated by a "/" character as per X3D proposal.
    # Whitespace in modes is ignored.
    # This is the same as option 2, but instead of implicit defaults, it must always be explicitly specified
    #2=IfcImageTexture($,$,'MODULATE / REPLACE',$,('','','1 1 1','1'),'data/pattern.png')
    #3=IfcImageTexture($,$,'SUBTRACT / ADD',$,('','','1 1 1','1'),'data/squirrel.png')
    appearance Appearance {
      texture MultiTexture {
        texture [
          ImageTexture { url "data/squirrel.png" }
          ImageTexture { url "data/pattern.png" }
        ]
        mode [ "REPLACE" "BLENDFACTORALPHA" ]
        alpha 0.2
      }
      material Material { }
    }
    # BLENDFACTORALPHA is one of the most common blending modes applied by artists
    #2=IfcImageTexture($,$,'REPLACE',$,('','','1 1 1','1'),'data/squirrel.png')
    #3=IfcImageTexture($,$,'BLENDFACTORALPHA',$,('','','1 1 1','0.2'),'data/pattern.png')
    # Now let's look at common texturing situations
    # Scenario 1: One diffuse texture and nothing else
    #2=IfcImageTexture($,$,'REPLACE',$,('','','1 1 1','1'),'data/squirrel.png')
    # Scenario 2: A diffuse texture mixing with a flat colour and nothing else
    #2=IfcImageTexture($,$,'BLENDFACTORALPHA',$,('FACTOR','','1 0 0','0.2'),'data/squirrel.png')
    # Scenario 3: Using a different blending mode with a factor.
    # In scenario 2, "REPLACE" is the implicit blending mode. However artists use a variety of modes.
    # X3D does not have enum values for these modes. We propose to add the following modes:
    # - ... in short, the name of the blending mode with the suffix "FACTORALPHA" to denote the factor source
    # X3D only has a handful of blending modes, most notably REPLACE, MODULATE, ADD, SUBTRACT.
    # Artists commonly use many more blending modes; there is a long list.
    # However, only a handful are commonly used in textures. Here's a list of common ones:
    # - DARKEN
    # - LIGHTEN
    # - SCREEN
    # - OVERLAY
    # https://en.wikipedia.org/wiki/Blend_modes shows some relevant equations
    #2=IfcImageTexture($,$,'MODULATEFACTORALPHA',$,('FACTOR','','1 0 0','0.2'),'data/squirrel.png')
    # Scenario 4: One diffuse texture and a bump map
    # The X3D v3.2 specification is not capable of storing different texture maps.
    # A proposal was made to extend X3D v3.2 to store texture maps in its Appearance entity.
    # In X3D v4, similar to the proposal, texture maps are stored in the Material entity.
    # Neither Appearance nor Material has any relationship to IFC, so there is no obvious way to use this proposal in IFC.
    # X3D has a source parameter. This source parameter has possible values such as:
    # - ""
    # - "DIFFUSE"
    # - "SPECULAR"
    # - "FACTOR"
    # Despite the naming similarity to things like diffuse map or specular map, that is not their purpose.
    # Also note that the purpose of diffuse / specular is to combine the texture with the Gouraud-shaded object.
    # Gouraud shading is nearly obsolete; no one uses it.
    # Texture combination also happens prior to shading, not after, except in special edge cases.
    # For this, we propose that only "" and "FACTOR" should be recommended in IFC.
    # Just as a new attribute was added in X3D, we propose a new parameter to be added.
    # This means there will be 5 parameters in the list. The 5th refers to the map type.
    # The list of valid map types is based on IFC2X3.
    # - BUMP,
    # - NORMAL,
    # - OPACITY,
    # - SHININESS,
    # - SPECULAR,
    # - DIFFUSE,
    # Note that we propose to add NORMAL. This is different to a BUMP (or height) map as it has XYZ values.
    # Note that SHININESS should be mentioned to be the inverse of roughness to help understanding.
    # Note that we propose to remove TRANSPARENCYMAP. It is the opposite of an OPACITY map and the
    # industry convention is to use 1 1 1 = OPAQUE.
    # Note that we have renamed TEXTURE to be DIFFUSE because that is the more correct term.
    # Note that REFLECTION is removed in favour of SPECULAR. They serve the same function. REFLECTION was
    # used back before PBR was popular.
    # There may be other uncommon or complex maps, like AO, ALBEDO, SSS, DISPLACEMENT, METAL, etc.
    # It is probably not necessary to list every possible map.
    #2=IfcImageTexture($,$,'REPLACE',$,('','','1 1 1','1','DIFFUSE'),'data/pattern.png')
    #3=IfcImageTexture($,$,'REPLACE',$,('','','1 1 1','1','BUMP'),'data/squirrel.png')
    # In arch viz, the shader is a combination of texture maps and a shader type with parameters.
    # The shader type is not part of the texture maps and is not in scope for IfcSurfaceTexture.
    # The shader type is defined in IfcSurfaceStyleRendering: FLAT, MATT, METAL, MIRROR, etc.
    # Depending on the shader type, certain texture maps may not apply.
    # For example, only a DIFFUSE map will affect a FLAT shader type.
    # Therefore for a full shader, IfcSurfaceStyleRendering and IfcSurfaceStyleWithTextures must be read together.
    # A shader may require extra parameters, e.g. a GLASS shader needs an IOR.
    # There is no place to put these right now.
    # Some parameters may be arbitrary and are specific to each engine. Others are common.
    # There are only a handful of popular engines which can be catered to on a case-by-case basis.
    # A few common ones are listed for consideration to be added as attributes to IfcSurfaceStyleRendering:
    # - Anisotropy
    # - Roughness (distinct from Specularity)
    # - IOR
    # Scenario 5: Every single texture map
    # At this point, after all the proposals, in theory we can do everything, with some engine-specific caveats.
    # Note that a normal map is used, not a bump, since they are mutually exclusive.
    #2=IfcImageTexture($,$,'REPLACE',$,('','','1 1 1','1','DIFFUSE'),'data/pattern.png')
    #3=IfcImageTexture($,$,'REPLACE',$,('','','1 1 1','1','NORMAL'),'data/squirrel.png')
    #4=IfcImageTexture($,$,'REPLACE',$,('','','1 1 1','1','OPACITY'),'data/squirrel.png')
    #5=IfcImageTexture($,$,'REPLACE',$,('','','1 1 1','1','SELFILLUMINATION'),'data/squirrel.png')
    #6=IfcImageTexture($,$,'REPLACE',$,('','','1 1 1','1','SHININESS'),'data/squirrel.png')
    #7=IfcImageTexture($,$,'REPLACE',$,('','','1 1 1','1','SPECULAR'),'data/squirrel.png')
    # Scenario 6: Diffuse and normal, but diffuse multiplies with a 50% factor of two images.
    #2=IfcImageTexture($,$,'REPLACE',$,('','','1 1 1','1','DIFFUSE'),'data/pattern.png')
    #3=IfcImageTexture($,$,'REPLACE',$,('','','1 1 1','1','NORMAL'),'data/squirrel.png')
    #4=IfcImageTexture($,$,'MODULATEFACTORALPHA',$,('','','1 1 1','0.5','DIFFUSE'),'data/squirrel.png')
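The SUBTRACT pitfall discussed in the dump above can be shown numerically. A hedged sketch, ours rather than a spec implementation:

```python
# Hedged sketch of the SUBTRACT pitfall described above.
# Current X3D behaviour subtracts on all of RGBA; with two opaque
# textures (alpha 1.0 each) the result's alpha is 1 - 1 = 0: invisible.
# The proposed conventions apply the mode to RGB only.

def subtract_rgba(a, b):
    """Current X3D behaviour: the mode applies to all four channels."""
    return tuple(max(0.0, x - y) for x, y in zip(a, b))

def subtract_rgb_only(a, b):
    """Proposed behaviour: mode applies to RGB; alpha kept from the first texture."""
    rgb = tuple(max(0.0, x - y) for x, y in zip(a[:3], b[:3]))
    return rgb + (a[3],)

opaque_a = (0.8, 0.6, 0.4, 1.0)
opaque_b = (0.2, 0.2, 0.2, 1.0)

print(subtract_rgba(opaque_a, opaque_b))      # alpha collapses to 0.0: invisible
print(subtract_rgb_only(opaque_a, opaque_b))  # alpha stays 1.0: still opaque
```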
  • Here are the node trees for the last two scenarios (I need to double-check some of these, but the concept is there):

  • edited February 2022

    So I think a number of us now have a pretty clear idea of what you can and can't do with texture node trees in IFC. Even better, we've now actually implemented the existing specification for export (up to a point, missing some of the more obscure features). To our knowledge we're not only the first people to implement this in IFC... but also among the few people who care about it in the X3D world. IFC seems to have chosen one of the most obscure and most buggily implemented aspects of the X3D spec to reference.

    However, referencing X3D was a good move in general, I feel, because X3D is much more involved in the texturing world than IFC is, so wherever X3D goes, IFC can follow. Michalis, an expert in X3D, has confirmed our understanding and current implementation. More interestingly, he also confirmed the various shortcomings of X3D, and its future direction. It seems that X3D v4 (IFC references v3.2) is now mirroring the approach of glTF, so the two will be compatible. This means that with a few tweaks, perhaps even without changing the current specification, we can make IFC4 compatible with X3D v4 and glTF. What a huge achievement that would be! Not only for interoperability, but for ensuring that the texturing systems are appropriate for modern artists and their tools.

    It's difficult to summarise the full extent of what we've learned, but here's an attempt to put it into terms that most 3D artists would appreciate.

    • Out of the box, IFC supports a diffuse texture.
    • Out of the box, IFC still prioritises biased lighting models, both in choosing the lighting model name (e.g. PHONG) as well as its attributes. Boo.
    • With an unofficial convention inspired from X3D, we can nominate a PBR material instead of a biased lighting model.
    • With an unofficial convention inspired from X3D, we can nominate other types of maps: specular, normal, etc. The usual stuff.
    • For different lighting models, some texture maps will have no effect. This is expected. For example, a normal map has no effect on a FLAT lighting model.
    • Your texture can be made out of multiple textures, combined one after another using blending modes.
    • Out of the box, not all blending modes are officially supported (e.g. multiply is, but overlay is not).
    • Out of the box, not all blending mode intensities are officially supported (e.g. mixing with a factor of 0.2 is supported, but add or multiply cannot have a factor of 0.2; they are locked to 1 or 0).
    • With an unofficial convention we can support all blending modes and support factor intensities for all blending modes. Hooray!
    • Out of the box, you can blend with a flat colour, but only if that is the very first thing you do in your stack of textures. You cannot do it midway.
    • Maybe, with a very ugly hack I don't like, we can blend with a flat colour anywhere in the stack by defining a 1px texture in the IFC code. Nasty but in theory it works.
    • Out of the box, X3D (and so IFC) actually allows you to apply blending modes after lighting is calculated, not just prior. This means IFC not only supports mixing textures prior to rendering; it inadvertently also supports describing compositing. However, the spec only allows you to composite on an image rendered with obsolete techniques like a Gouraud shader, so in practice this is impossible to implement with modern tools.
    • Out of the box, the rules about how blend modes are applied are not intuitive to artists. This is not necessarily a bug, as the implementation allows for the usual implementation we would expect in software like the GIMP / Photoshop / Krita / Blender / etc, but I highly suspect other implementers will get confused and we might get buggy implementations.
    • Out of the box, X3D v3.2 also has confusing rules about treating grayscale images differently by default from RGB textures. Again, not strictly a bug, but very likely implementers will get confused and we will get buggy implementations. This has been fixed in X3D v4.
    • Movie textures are not supported in IFC, but are supported in X3D.
    • Procedural textures (noise, brick, musgrave, voronoi) are not supported
    • UV coordinates are supported
    • Generated coordinates are supported (X3D calls this COORD)
    • Camera coordinates are supported (X3D calls this either CAMERASPACEPOSITION or COORD-EYE with seeming equivalency)
    • Out of the box X3D (and therefore IFC) supports a number of other coordinate types. Some of these are downright crazy and nobody would ever use them. I don't think anyone would implement them. As a result of our conversation, we suspect these will see a culling in X3D. For example, when have you ever taken geometry vertex coordinates then applied a noise with a user configurable parameter to distort your texture coordinates?
    • Using a transformation mapping to shift texture coordinates is supported
    • Some other modifiers, like some forms of vector math and colour invert, are supported.
    • Other common modifiers like hue / saturation, colour curves, RGB to BW, etc, are not supported.
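To illustrate the unofficial factor convention mentioned above (e.g. MODULATEFACTORALPHA with a factor of 0.2), here is a hedged Python sketch of our reading of it: blend with the chosen mode, then mix the result back towards the previous colour by the factor. This is an interpretation, not a spec implementation:

```python
# Hedged sketch of the proposed "<MODE>FACTORALPHA" convention:
# apply the blending mode, then linearly mix the blended result with
# the previous colour using the factor. Our interpretation only.

def modulate(a, b):
    return tuple(x * y for x, y in zip(a, b))

def blend_with_factor(mode_fn, acc, tex, factor):
    """result = lerp(acc, mode_fn(acc, tex), factor)"""
    blended = mode_fn(acc, tex)
    return tuple(a + (b - a) * factor for a, b in zip(acc, blended))

base = (0.8, 0.8, 0.8)
tex = (0.5, 0.5, 0.5)

# Factor 1.0 is a plain MODULATE; factor 0.0 leaves the base untouched;
# factor 0.5 lands halfway between the two.
print(blend_with_factor(modulate, base, tex, 1.0))
print(blend_with_factor(modulate, base, tex, 0.0))
print(blend_with_factor(modulate, base, tex, 0.5))
```

The same wrapper works for any mode function (add, subtract, screen, overlay), which is what makes the suffix convention attractive.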

    Here are some links Michalis pointed us to for the record:

    I hope to put forward a series of proposals, in time for IFC 4.3, which will align IFC, X3D 4, and glTF.

  • Thank you @Moult for the great conversation! I confirm everything in the above summary :)

    Let me add a note summarizing the X3Dv4 approach to textures:

    X3D v4 has two places where you can put textures:

    1. You can put them in material fields, like Material.diffuseTexture, Material.specularTexture, PhysicalMaterial.baseTexture, PhysicalMaterial.metallicRoughnessTexture. See the X3D spec for the XxxMaterial nodes; they all have a number of xxxTexture fields: https://www.web3d.org/specifications/X3Dv4Draft/ISO-IEC19775-1v4-CD/Part01/components/shape.html#Material

    2. You can put a texture (possibly MultiTexture) in Appearance.texture. See https://www.web3d.org/specifications/X3Dv4Draft/ISO-IEC19775-1v4-CD/Part01/components/shape.html#Appearance .

    AD 1 is the more modern approach, matching at least glTF and Collada capabilities, and matching how authoring tools like Blender work. Each texture clearly affects a specific material parameter.

    AD 2 is the backward-compatible approach. We don't plan to remove it in the foreseeable future... though we encourage using AD 1, at least for simple textures (when you don't need MultiTexture).

    • It interacts with approach AD 1 in a clear way: if you use Appearance.texture, and Material.diffuseTexture is empty, then Appearance.texture acts exactly as Material.diffuseTexture. (There are similar rules for PhysicalMaterial.baseTexture, UnlitMaterial.emissiveTexture.)

    • AD 2 is also the (only) gateway to continue using MultiTexture in X3Dv4, so you can compose a number of textures to calculate the diffuse parameter (for Phong) or base (for PBR).

  • Cheers @michalis ! One thing I'd also like to investigate is whether or not we can align X3D, glTF, and IFC for texture UV coordinates. Is this something X3D and glTF have consensus on? Currently, IFC is actually missing texture UV coordinate support for non-triangulated meshes, so the time is ripe for some proposals whilst they're fixing that aspect.

    Further reading on the current IFC UV coordinate shortcomings: https://github.com/buildingSMART/IFC4.3.x-development/issues/135

  • @Moult said:
    Cheers @michalis ! One thing I'd also like to investigate is whether or not we can align X3D, glTF, and IFC for texture UV coordinates. Is this something X3D and glTF have consensus on? Currently, IFC is actually missing texture UV coordinate support for non-triangulated meshes, so the time is ripe for some proposals whilst they're fixing that aspect.

    Further reading on the current IFC UV coordinate shortcomings: https://github.com/buildingSMART/IFC4.3.x-development/issues/135

    In the case of texture coordinates, X3D actually has a bit more than glTF.


    • glTF allows only 2D texture coordinates, as it only allows 2D textures (see https://www.khronos.org/registry/glTF/specs/2.0/glTF-2.0.html#meshes-overview - TEXCOORD_n is VEC2).

    • For each texture, glTF allows you to say whether to use the coordinates in TEXCOORD_0 or TEXCOORD_1 (implementations may support more, but the glTF spec only requires support for 2: "Client implementations SHOULD support at least two texture coordinate sets").


    X3D allows more.

    • You can have many texture coordinates using MultiTextureCoordinate with children (you can specify as many as you like, though client implementations will certainly have some limits as underlying shaders have limits).

    • Texture coordinates may be 2D, 3D, 4D using TextureCoordinate, TextureCoordinate3D, TextureCoordinate4D, you can also auto-generate them using TextureCoordinateGenerator.

    • Each texture (in material) has a field that says which one to use, like Material.diffuseTextureMapping and PhysicalMaterial.baseTextureMapping, and it should match the mapping field in one of the texture coordinates. If you use MultiTexture in Appearance.texture then it's just backward compatible, i.e. textures in MultiTexture take coordinates from MultiTextureCoordinate in the same order.

    X3D can trivially express the glTF case: just use the mapping names TEXCOORD_0, TEXCOORD_1 in fields like Material.diffuseTextureMapping, PhysicalMaterial.baseTextureMapping, TextureCoordinate.mapping. This allows expressing glTF in a mostly straightforward way (this is what Castle Game Engine / view3dscene does under the hood when you open glTF).
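That mapping-name scheme can be sketched as a simple lookup. The field names come from the post above; the data layout is purely illustrative:

```python
# Hedged sketch: resolving which texture-coordinate set a texture uses
# via X3D v4 "mapping" names, mirroring the glTF TEXCOORD_n convention.
# The dict layout is our own illustration, not an X3D or glTF data model.

coordinate_sets = {
    "TEXCOORD_0": [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0)],  # e.g. base colour UVs
    "TEXCOORD_1": [(0.0, 0.5), (0.5, 0.5), (0.5, 1.0)],  # e.g. a second UV layer
}

# Per-texture mapping names, in the spirit of Material.diffuseTextureMapping etc.
textures = {
    "diffuseTexture": "TEXCOORD_0",
    "occlusionTexture": "TEXCOORD_1",
}

def uvs_for(texture_name):
    """Follow the mapping name from a texture to its coordinate set."""
    mapping = textures[texture_name]
    return coordinate_sets[mapping]

print(uvs_for("diffuseTexture")[0])  # first UV of the set the diffuse map uses
```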

  • @michalis thanks! I've got quite a bit to learn. In X3D, we talked about how COORD maps to "Generated" texture coordinates with this definition:

    Automatically-generated texture coordinates from the vertex positions of the mesh without deformation, keeping them sticking to the surface under animation. Range from 0.0 to 1.0 over the bounding box of the undeformed mesh. See Texture Spaces for more information.

    However, does COORD have any effect on the projection method in the Image Texture node? (See link below.) Does choosing "Flat", "Box", "Sphere" or "Tube" have any impact? Is there an equivalent attribute for this projection in X3D? Being able to do Box projection for textures on extruded geometry in IFC is very important, so I'd like to confirm the support for it :)

  • edited February 2022
    1. X3D TextureCoordinateGenerator.mode="COORD" means that we provide 3D texture coordinates based on mesh vertices' 3D positions. This is like Blender's "Flat" mapping, but in 3D (it only matters if you use 3D textures; for 2D textures, it just means that the XY position determines UV).

    2. The default texture mapping on an X3D IndexedFaceSet chooses the 2 largest axes and maps them to the U and V texture coordinates. In CGE you can also request it explicitly on any geometry, via TextureCoordinateGenerator.mode="BOUNDS2D". This does not immediately correspond to any Blender mode (though you could perhaps construct it using Blender nodes, as it's like Blender "Flat" but with flexibility over which 3D vertex axis corresponds to which 2D texture coordinate axis; I'm sure it's possible to configure Blender nodes to achieve this).

    3. The effects of TextureCoordinateGenerator and the above-mentioned "default texture mapping on IndexedFaceSet" are recalculated every frame. (Though I recall we had some confusion about this on the X3D mailing list a few years ago, and I'm not 100% sure whether it's true for all X3D browsers. But it should be true for Castle Game Engine / view3dscene and X3DOM.) So no X3D automatic texture coordinate mapping achieves the effect of "keeping them sticking to the surface under animation". All X3D automatic texture coordinate methods are recalculated every frame; X3D doesn't specify to "calculate them once, and then keep them sticky as the mesh deforms". The way to implement "sticky" texture coordinates is to generate explicit UVs in Blender and export them as explicit X3D TextureCoordinate values. These are "sticky".

    4. X3D also doesn't have any automatic generation resembling Blender's "Box", "Sphere" or "Tube". Such mappings would also have to be set as UV coordinates and exported as explicit X3D TextureCoordinate values.

    These are some notes "around" the questions you asked :) I'm not sure whether I answered everything properly; please ask more if anything above is unclear :)
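The "two largest axes" default mapping described in point 2 can be sketched like so. This is our reconstruction of the described behaviour (U along the largest bounding-box axis, V along the second largest, both normalised by the same scale), not Castle Game Engine source:

```python
# Hedged sketch of X3D's default IndexedFaceSet texture mapping
# (BOUNDS2D-like): pick the two largest bounding-box axes and map
# them to U and V, using the largest axis as the common scale.
# Our reconstruction of the behaviour described above.

def bounds2d_uvs(vertices):
    mins = [min(v[i] for v in vertices) for i in range(3)]
    maxs = [max(v[i] for v in vertices) for i in range(3)]
    sizes = [maxs[i] - mins[i] for i in range(3)]
    # Indices of the two largest axes: largest becomes U, second largest V.
    u_axis, v_axis = sorted(range(3), key=lambda i: sizes[i], reverse=True)[:2]
    u_size = sizes[u_axis] or 1.0  # guard against degenerate geometry
    return [
        ((v[u_axis] - mins[u_axis]) / u_size,
         (v[v_axis] - mins[v_axis]) / u_size)  # V uses the same scale as U
        for v in vertices
    ]

# A thin wall: large in X and Z, thin in Y, so U follows X and V follows Z.
wall = [(0, 0, 0), (4, 0, 0), (4, 0.2, 2), (0, 0.2, 2)]
print(bounds2d_uvs(wall))
```

Note that this is recomputed from the current vertex positions, which is exactly why it cannot be "sticky" under deformation.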

  • edited March 2023

    Interesting, are there any IFC viewers capable of displaying this? I like pictures more

  • Not that I know of.
