Neural BRDF synthesis for physically-based material generation. Text-to-material and image-to-material workflows for lookdev artists.
Traditional material authoring means manually tuning albedo, roughness, normal, and metallic maps. This is time-consuming and demands deep technical knowledge of how light interacts with surfaces.
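For context, a minimal sketch of the isotropic GGX / Cook-Torrance model these maps feed. This is standard PBR math for illustration, not this project's exact shading code:

```python
# Minimal isotropic GGX / Cook-Torrance evaluation (illustrative sketch):
# shows how per-texel albedo, roughness, and metallic values parameterize
# the BRDF that lookdev artists normally tune by hand.
import numpy as np

def ggx_brdf(n, v, l, albedo, roughness, metallic):
    """Single-sample Cook-Torrance BRDF; n, v, l are unit vectors."""
    h = (v + l) / np.linalg.norm(v + l)           # half vector
    n_dot_v = max(np.dot(n, v), 1e-4)
    n_dot_l = max(np.dot(n, l), 1e-4)
    n_dot_h = max(np.dot(n, h), 0.0)
    v_dot_h = max(np.dot(v, h), 0.0)

    a = roughness ** 2                            # perceptual roughness -> alpha
    # GGX normal distribution term
    d = a**2 / (np.pi * ((n_dot_h**2) * (a**2 - 1.0) + 1.0) ** 2)
    # Smith-Schlick geometry term
    k = (roughness + 1.0) ** 2 / 8.0
    g = (n_dot_v / (n_dot_v * (1 - k) + k)) * (n_dot_l / (n_dot_l * (1 - k) + k))
    # Schlick Fresnel, metallic workflow: F0 blends dielectric 0.04 with albedo
    f0 = 0.04 * (1.0 - metallic) + albedo * metallic
    f = f0 + (1.0 - f0) * (1.0 - v_dot_h) ** 5

    specular = d * g * f / (4.0 * n_dot_v * n_dot_l)
    diffuse = (1.0 - f) * (1.0 - metallic) * albedo / np.pi
    return (diffuse + specular) * n_dot_l
```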
Lookdev artists often work from reference images—photos of real materials they want to recreate. The question: can neural networks learn material properties from these references and generate production-ready texture sets?
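One plausible shape for the image-to-material network, as an illustration only; the encoder/decoder layout, channel counts, and per-map heads below are assumptions, not the system's actual architecture:

```python
# Sketch of an image-to-material network (assumed architecture): a shared
# encoder with one decoder head per texture map.
import torch
import torch.nn as nn

class SVBRDFNet(nn.Module):
    """Predicts albedo (3ch), normal (3ch), roughness (1ch), metallic (1ch)
    from a single reference photo."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(128, 256, 4, stride=2, padding=1), nn.ReLU(),
        )
        def head(out_ch):
            return nn.Sequential(
                nn.ConvTranspose2d(256, 128, 4, stride=2, padding=1), nn.ReLU(),
                nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),
                nn.ConvTranspose2d(64, out_ch, 4, stride=2, padding=1), nn.Sigmoid(),
            )
        self.albedo, self.normal = head(3), head(3)
        self.roughness, self.metallic = head(1), head(1)

    def forward(self, photo):                      # photo: (B, 3, H, W) in [0, 1]
        feat = self.encoder(photo)
        return {
            "albedo": self.albedo(feat),
            "normal": self.normal(feat) * 2.0 - 1.0,   # remap to [-1, 1]
            "roughness": self.roughness(feat),
            "metallic": self.metallic(feat),
        }
```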
Proof-of-concept complete for basic material classes: wood, metal, concrete, stone. The system successfully generates plausible texture maps that render correctly under different lighting conditions.
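A hypothetical relighting sanity check along these lines, reusing the `ggx_brdf` sketch above; the light sweep and thresholds are illustrative, not the project's validation suite:

```python
# Hypothetical relighting check: evaluate a predicted texel under several
# light directions and confirm the response stays physically plausible.
# Assumes ggx_brdf from the sketch above is in scope.
import numpy as np

def relight_check(albedo, roughness, metallic, n_lights=8):
    n = np.array([0.0, 0.0, 1.0])                 # flat surface normal
    v = np.array([0.0, 0.0, 1.0])                 # camera straight on
    for i in range(n_lights):
        theta = np.pi / 3 * (i + 1) / n_lights    # sweep light up to 60 degrees
        l = np.array([np.sin(theta), 0.0, np.cos(theta)])
        rgb = ggx_brdf(n, v, l, albedo, roughness, metallic)
        assert np.all(rgb >= 0.0), "negative radiance"
        if metallic < 0.5:                        # rough energy bound for dielectrics
            assert np.all(rgb <= 1.5), "implausible energy gain"

relight_check(albedo=np.array([0.6, 0.4, 0.3]), roughness=0.5, metallic=0.0)
```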
Currently expanding training dataset to include complex materials: translucent (skin, wax), iridescent (oil slick, butterfly wings), and anisotropic (brushed metal, hair). These require more sophisticated BRDF representations.
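As an example of what "more sophisticated" means in practice: the standard anisotropic GGX distribution adds separate roughness values along the tangent and bitangent, which the isotropic albedo/roughness/metallic set cannot express. A short sketch (assumed extension, not the project's final representation):

```python
# Anisotropic GGX normal distribution term: brushed metal and hair need
# distinct roughness along the tangent (alpha_x) and bitangent (alpha_y).
import numpy as np

def ggx_aniso_d(h, t, b, n, alpha_x, alpha_y):
    """Anisotropic GGX distribution; h, t, b, n are unit vectors."""
    t_dot_h, b_dot_h, n_dot_h = np.dot(t, h), np.dot(b, h), np.dot(n, h)
    denom = (t_dot_h / alpha_x) ** 2 + (b_dot_h / alpha_y) ** 2 + n_dot_h ** 2
    return 1.0 / (np.pi * alpha_x * alpha_y * denom ** 2)

# Brushed metal: smooth along the grooves, rough across them, so the
# highlight streaks perpendicular to the brushing direction.
n = np.array([0.0, 0.0, 1.0])
t = np.array([1.0, 0.0, 0.0])                     # brushing direction
b = np.array([0.0, 1.0, 0.0])
h = np.array([0.1, 0.0, 0.995]) / np.linalg.norm([0.1, 0.0, 0.995])
print(ggx_aniso_d(h, t, b, n, alpha_x=0.05, alpha_y=0.3))
```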
Exploring integration with USD workflows for direct Hydra rendering. The goal is seamless material generation within existing DCC tools rather than a standalone application.
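A sketch of what the USD side could look like: wrapping a generated texture set in a UsdPreviewSurface network so any Hydra delegate can render it. Prim paths and file names are placeholders, and the snippet assumes the pxr Python bindings from a USD install:

```python
# Placeholder USD integration sketch: author a UsdPreviewSurface material
# around generated texture maps so it renders through Hydra in DCC tools.
from pxr import Sdf, Usd, UsdShade

stage = Usd.Stage.CreateNew("neural_material.usda")
material = UsdShade.Material.Define(stage, "/Materials/NeuralMat")
surface = UsdShade.Shader.Define(stage, "/Materials/NeuralMat/Surface")
surface.CreateIdAttr("UsdPreviewSurface")

# One UsdUVTexture reader per generated map (albedo shown; roughness,
# metallic, and normal follow the same pattern). An st primvar reader is
# omitted here for brevity.
albedo_tex = UsdShade.Shader.Define(stage, "/Materials/NeuralMat/AlbedoTex")
albedo_tex.CreateIdAttr("UsdUVTexture")
albedo_tex.CreateInput("file", Sdf.ValueTypeNames.Asset).Set("albedo.png")
albedo_tex.CreateOutput("rgb", Sdf.ValueTypeNames.Float3)

surface.CreateInput("diffuseColor", Sdf.ValueTypeNames.Color3f).ConnectToSource(
    albedo_tex.ConnectableAPI(), "rgb")
material.CreateSurfaceOutput().ConnectToSource(
    surface.ConnectableAPI(), "surface")

stage.GetRootLayer().Save()
```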
The breakthrough isn't replacing material artists—it's accelerating the iteration loop. Artists can generate a dozen material variations in seconds, select the best starting point, then refine using traditional tools. The neural network handles the "80% there" first pass; artists add the final 20% that makes it production-ready.