PipeMFL-240K: A Large-scale Dataset and Benchmark for Object Detection in Pipeline Magnetic Flux Leakage Imaging
arXiv:2602.07044v2 Announce Type: replace-cross Abstract: Pipeline integrity is critical to industrial safety and environmental protection, with Magnetic Flux Leakage (MFL) detection being a primary non-destructive testing technology. Despite the promise of deep learning for automating MFL interpretation…
Flux Attention: Context-Aware Hybrid Attention for Efficient LLMs Inference
arXiv:2604.07394v1 Announce Type: cross Abstract: The quadratic computational complexity of standard attention mechanisms presents a severe scalability bottleneck for LLMs in long-context scenarios. While hybrid attention mechanisms combining Full Attention (FA) and Sparse Attention (SA) offer a potential…
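The paper's specific Flux Attention design is not detailed in this excerpt, but the cost asymmetry it targets can be sketched generically. The snippet below (an illustration, not the paper's method) compares the multiply-add counts of full attention, which is quadratic in sequence length n, against a sliding-window sparse variant where each query attends to at most w keys; the window size w and dimensions are arbitrary example values.

```python
def full_attention_flops(n: int, d: int) -> int:
    # QK^T and the attention-weighted sum over V each cost ~n*n*d
    # multiply-adds, so full attention scales as O(n^2 * d).
    return 2 * n * n * d

def sliding_window_flops(n: int, d: int, w: int) -> int:
    # Each query attends to at most w keys, so cost drops to O(n * w * d).
    return 2 * n * w * d

# Example values (assumed for illustration): 8K context, head dim 64,
# window of 256 keys per query.
n, d, w = 8192, 64, 256
speedup = full_attention_flops(n, d) // sliding_window_flops(n, d, w)
print(speedup)  # n / w = 32
```

The ratio reduces to n / w, which is why sparse components become increasingly attractive as context length grows while full attention is retained only where it adds accuracy.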
CellFluxRL: Biologically-Constrained Virtual Cell Modeling via Reinforcement Learning
arXiv:2603.21743v3 Announce Type: replace Abstract: Building virtual cells with generative models to simulate cellular behavior in silico is emerging as a promising paradigm for accelerating drug discovery. However, prior image-based generative approaches can produce implausible cell images that violate…
LumaFlux: Lifting 8-Bit Worlds to HDR Reality with Physically-Guided Diffusion Transformers
arXiv:2604.02787v1 Announce Type: cross Abstract: The rapid adoption of HDR-capable devices has created a pressing need to convert 8-bit Standard Dynamic Range (SDR) content into perceptually and physically accurate 10-bit High Dynamic Range (HDR). Existing inverse tone-mapping (ITM) methods often…
FluxMoE: Decoupling Expert Residency for High-Performance MoE Serving
arXiv:2604.02715v1 Announce Type: new Abstract: Mixture-of-Experts (MoE) models have become a dominant paradigm for scaling large language models, but their rapidly growing parameter sizes introduce a fundamental inefficiency during inference: most expert weights remain idle in GPU memory while comp…
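The idleness the abstract describes follows directly from standard top-k MoE routing, which this minimal sketch illustrates (generic routing, not FluxMoE's serving scheme; all sizes and the random router are assumed for illustration): each token's router logits select only its top-k experts, so the fraction of expert weights doing work per token is k divided by the expert count.

```python
import numpy as np

rng = np.random.default_rng(0)
num_experts, top_k, d, num_tokens = 64, 2, 16, 4

tokens = rng.standard_normal((num_tokens, d))
router = rng.standard_normal((d, num_experts))   # toy linear router

logits = tokens @ router
# Each token is dispatched to only its top_k experts; the other
# num_experts - top_k experts stay resident in memory but idle.
chosen = np.argsort(logits, axis=-1)[:, -top_k:]

active_fraction = top_k / num_experts
print(chosen.shape)      # (4, 2): two experts per token
print(active_fraction)   # 0.03125: ~3% of expert weights active per token
```

With 64 experts and top-2 routing, roughly 97% of expert parameters are untouched for any given token, which motivates decoupling where experts reside from where computation runs.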