Here is what shipped and where you can use it today. On Sept 18, 2025, Luma AI launched Ray3, with Adobe Firefly as the first external partner integration. Ray3 is live inside Firefly and on Luma’s Dream Machine, with a two-week promo window: paid Firefly and Creative Cloud Pro plans get unlimited Ray3 generations through Oct 1. Multiple outlets repeated that window, and Adobe’s post makes the packaging clear: during the promo, Ray3 sits inside Firefly and is also available on Dream Machine.
What Ray3 is trying to be
Ray3 is being positioned as a professional video generation model built for real pipelines. There are three headline claims:
- Reasoning video model. Luma and the press describe Ray3 as a model that thinks like a creative partner with a multimodal reasoning system that plans and refines scenes for better coherence.
- Native HDR output. Ray3 is the first model marketed as producing studio-grade 16-bit HDR with EXR output. This is the technical stake in the ground for film, ads, and game workflows that expect ACES-friendly files and high dynamic range grading headroom.
- Faster iteration with Draft Mode. Draft Mode is advertised as up to 5x faster and 5x cheaper, letting you explore shots before promoting finalists to high-fidelity output.
There are workflow controls and tools on day one: two-keyframe control using first and last frames, immediate support for vertical, horizontal, and ultra-wide panoramas, plus Firefly Boards integration for ideation and storyboarding. Luma’s site and screenshots show image-to-video, Extend, and Loop, with high fidelity output and neural upscaling for delivery formats. Launch coverage also calls out 10-second clip length, with 1080p native output and neural 4K upscaling in the pipeline.
Availability: the confusion and the reality
Social posts on launch day created mixed impressions. Some called Ray3 an Adobe partner model exclusive to Firefly. Others said it was also available for free on Dream Machine. Adobe’s post resolves it. During the promo window, Ray3 is available in Firefly and on Dream Machine, with Firefly positioned as the first external partner and paid Firefly and Creative Cloud Pro users receiving unlimited Ray3 generations through Oct 1. Outside that window, broader availability and pricing will matter more, but that was not spelled out in the items I reviewed.
Spec notes pulled from launch materials
- Clip length up to 10 seconds at launch.
- Native 16-bit HDR with EXR output positioned for ACES2065-1 oriented workflows.
- Supports vertical, horizontal, and ultra-wide panoramic formats on day one.
- Two-keyframe control for start and end frames, plus image-to-video, Extend, and Loop tools.
- Firefly Boards for planning and handoff into Premiere Pro and the rest of Creative Cloud.
- Firefly generations include Content Credentials for provenance.
What early content actually shows
The evidence right now is launch clips, side-by-sides, and UI screenshots. I saw repeated examples of HDR-grade contrast, fewer motion warps, and better temporal consistency than prior public models. That does not substitute for independent benchmarks or codec and latency sheets, which are not out yet in the surveyed sources. The gap is obvious: no public tables for motion metrics, no end-to-end latency data, and no frame rate or codec matrices beyond credit schedules and basic output notes. For professional work, those numbers matter.
Pricing on Dream Machine: credits and burn rate
Luma updated Dream Machine with Ray3 credit schedules, including Draft, 540p, 720p, HDR, and HDR+EXR. A few concrete examples:
- Ray3 720p HDR: 1,280 credits for 5 seconds, 2,560 credits for 10 seconds
- Ray3 720p HDR+EXR: 2,240 credits for 5 seconds, 4,480 credits for 10 seconds
One early tester reported that a 5-second 720p HDR+EXR clip at 2,240 credits cost about $6.72, and flagged how quickly spend adds up. Luma lists Top-Up credits starting at $4 for 1,200 credits, and there are Fast and Relaxed modes that trade turnaround time against credit cost. Expect to audit quality settings tightly and use Draft Mode to avoid burning credits before you have a confirmed idea.
Use Draft Mode first. Promote only finalists to HDR or HDR+EXR to control spend.
If you want a rough cash estimate, the tester’s example implies about $0.003 per credit for that transaction. At that rate, 1,280 credits for a 5-second 720p HDR clip is roughly $3.84, and 2,240 credits for a 5-second HDR+EXR clip is roughly $6.72. Using Luma’s $4 for 1,200 Top-Up credits works out to about $0.00333 per credit, so actual cost will vary by your bundle and region. The key point is the cost gradient between HDR and HDR+EXR: HDR+EXR costs about 75 percent more credits per second than HDR in these examples, so reserve it for shots that truly need grading headroom or VFX integration.
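To make the burn rate concrete, here is a small Python sketch of that arithmetic. The credit schedules come from the launch materials above; the per-credit rates are derived from the tester’s reported transaction and the Top-Up bundle, and are assumptions since actual pricing varies by bundle and region.

```python
# Rough Dream Machine cost estimator for Ray3 tiers.
# Credit schedules are from Luma's launch materials; the per-credit
# prices are inferred from one tester's report ($6.72 for 2,240 credits)
# and the $4 / 1,200-credit Top-Up bundle -- both assumptions, since
# actual pricing varies by bundle and region.

CREDITS_PER_5S = {          # credits for a 5-second clip
    "720p HDR": 1280,
    "720p HDR+EXR": 2240,
}

def clip_cost(tier: str, seconds: float, usd_per_credit: float) -> float:
    """Estimate USD cost; assumes credits scale linearly with duration."""
    credits = CREDITS_PER_5S[tier] * (seconds / 5)
    return credits * usd_per_credit

tester_rate = 6.72 / 2240    # ~$0.003 per credit
topup_rate = 4.00 / 1200     # ~$0.00333 per credit

print(f"{clip_cost('720p HDR', 5, tester_rate):.2f}")      # ~3.84
print(f"{clip_cost('720p HDR+EXR', 10, topup_rate):.2f}")  # ~14.93
```

Run a few of these before committing to a batch: a 10-second HDR+EXR pass at Top-Up rates is already in double-digit dollar territory, which is why Draft-first matters.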
Why HDR+EXR matters
Most AI video tools top out at 8-bit SDR delivery or a thin HDR claim without the pipeline formats that colorists expect. EXR with 16-bit precision opens the door to color-managed finishing in ACES, aggressive grading, and high-nit mastering for HDR displays without tearing the image apart. If the model’s temporal consistency holds up in longer beats and the highlight roll-off looks clean under grades, that is a real improvement for teams that want to keep AI shots in the same DI as camera originals. The caveat is obvious: we still need third-party tests to confirm highlight handling and banding behavior across realistic scenes, not just curated launch clips.
Reasoning and control: how much is new vs. re-labeled
Reasoning is the other big claim. The pitch is that Ray3 interprets prompts more like a collaborator and maintains a plan across frames, which should reduce odd scene jumps and physics glitches. A two-keyframe setup gives you structured control without scripting every beat. Vertical, horizontal, and ultra-wide formats help match platform delivery from social to scope-friendly cuts. These are all practical features. The question is reliability under messy prompts and hybrid workflows where you combine image-to-video, Extend, and Loop. The launch media looks solid, but I want to see prompt drift tests, motion persistence across scene actions, and tracking stability on fast camera moves before calling it a solved problem.
How to approach Ray3 during the promo window
- Start with Draft Mode. Explore multiple beats at low cost. Lock the shot idea, then promote a few finalists to Hi-Fi.
- Use two-keyframe control. Set start and end frames with clear intent. It produces more predictable motion arcs than freeform prompting alone.
- Pick formats by destination. Keep social work in vertical or horizontal 1080 for speed. Save HDR+EXR for shots that need grading headroom or integration into a filmed pipeline.
- Plan for budget on Dream Machine. If you are outside Firefly’s unlimited window, a few HDR+EXR iterations can burn through credits quickly. Track cost per second and quality of each pass.
- Keep an eye on codecs and frame rates. Public sheets are missing. If you require a specific mezzanine format, do not assume it is there until confirmed.
- Keep provenance intact. Firefly adds Content Credentials, which some clients and platforms now require.
Practical ACES notes for HDR workflows
If Ray3’s EXR exports are aligned with ACES2065-1, you should be able to ingest into an ACES-based timeline and grade alongside camera footage without remapping hacks. The benefit is consistent color management through the pipeline. The risks are familiar: banding if bit depth is mishandled, clipped highlights if tone mapping is wrong, and mismatched gamut if the export is not plainly documented. Until a full spec sheet is public, run a quick battery of tests: a skin tone chart with mixed lighting, a bright specular highlight sequence, a deep shadow shot with subtle gradients, and a saturated neon scene. Push and pull by several stops to check roll-off and banding. If it survives those moves, you have a working baseline.
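The banding half of that battery can be sanity-checked mechanically. This sketch uses only NumPy on a synthetic shadow gradient, not a real Ray3 export (export specifics are unconfirmed); it quantizes a smooth ramp at 8-bit and 16-bit storage depth, pushes exposure by three stops, and counts the distinct levels left, which is the headroom difference the EXR claim is about.

```python
import numpy as np

# Synthetic stand-in for the push/pull banding test: quantize a smooth
# deep-shadow ramp at a given bit depth, push exposure +3 stops, and
# count the distinct levels that survive. Fewer levels after the push
# means visible banding. This is a generated gradient, not Ray3 output.

ramp = np.linspace(0.0, 0.25, 100_000)  # deep-shadow gradient, linear light

def levels_after_push(signal: np.ndarray, bits: int, stops: float = 3.0) -> int:
    q = np.round(signal * (2**bits - 1)) / (2**bits - 1)  # storage quantization
    pushed = np.clip(q * 2**stops, 0.0, 1.0)              # exposure push
    return len(np.unique(pushed))

print(levels_after_push(ramp, 8))   # tens of levels: strong banding risk
print(levels_after_push(ramp, 16))  # thousands of levels: smooth under grade
```

The same counting trick works on real pixels once you can load the EXR: sample a flat-gradient region, push it, and compare level counts before and after.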
Evidence and gaps
- Evidence: posted demo clips, screenshots, and creator threads showing HDR look, temporal stability, and physics that appear improved over prior public models.
- Gaps: no public benchmark tables, no independent latency metrics, no codec and frame rate matrices beyond credit schedules, no controlled third-party tests of temporal coherence or HDR fidelity.
Who this helps on day one
- Filmmakers and editors who want fast previz and b-roll that can survive grading.
- Advertisers and motion teams generating short, cinematic inserts for 5 to 10 second beats.
- Game studios mocking up in-engine shots and mood pieces for pitch reviews.
- Social teams who need vertical or horizontal quick cuts with better motion stability than prior models.
Control and iteration strategy
Draft-to-final is the right mental model for Ray3. Start wide in Draft Mode, collect 6 to 12 options, then trim. Use two-keyframe control to nail the move, and only then consider promoting to HDR or HDR+EXR. For teams that want more context on speed-first tools vs finishing tools, I have notes on Lucy‑14B on Fal.ai and Krea’s Realtime Sculpted‑Video. If you want a check on pricing and throughput in another high-end model, see Google Veo 3. If your pipeline needs instruction-guided editing after generation, look at Decart Lucy Edit.
Firefly integration and Creative Cloud flow
Ray3 inside Firefly sits alongside Boards for ideation and makes it simple to hand off to Premiere Pro and the rest of Creative Cloud. Firefly generations include Content Credentials, which will matter for clients and platforms that require provenance. If you are running an Adobe-centric pipeline, this is the cleanest path for early access during the promo window. The dual availability with Dream Machine means teams that prefer Luma’s interface can still participate, though without the Firefly unlimited perk outside Adobe’s plans.
A compact test plan for week one
- Set three prompts per use case: one cinematic, one product macro, one action move. Keep prompts consistent across Draft and Hi-Fi.
- Run Draft Mode batches, save the top two per prompt based on motion stability and composition.
- Promote finalists to HDR, then to HDR+EXR if you plan to grade. Track credits, time to first frame, and total latency.
- Grade the EXR in ACES with basic pushes and pulls. Check for banding, clipped speculars, and color noise in deep shadows.
- Export to your delivery mezzanine and confirm that any frame rate or codec needs are supported without roundtrips.
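To keep the bookkeeping honest across those passes, a tiny tracker like this can log credits and latency per tier and report credits per delivered second. The field names and sample numbers are illustrative, not any Luma or Adobe API; fill the rows from your own runs.

```python
from dataclasses import dataclass, field

# Minimal pass tracker for the week-one test plan. Names and sample
# latencies here are illustrative placeholders, not a Luma/Adobe API.

@dataclass
class Pass:
    prompt: str
    tier: str        # "draft", "hdr", "hdr+exr"
    seconds: float   # delivered clip length
    credits: int
    latency_s: float # submit-to-final latency, measured by you

@dataclass
class TestLog:
    passes: list = field(default_factory=list)

    def add(self, p: Pass) -> None:
        self.passes.append(p)

    def credits_per_second(self, tier: str) -> float:
        rows = [p for p in self.passes if p.tier == tier]
        total_s = sum(p.seconds for p in rows)
        return sum(p.credits for p in rows) / total_s if total_s else 0.0

log = TestLog()
log.add(Pass("cinematic alley", "hdr", 5, 1280, 210.0))      # latency made up
log.add(Pass("product macro", "hdr+exr", 5, 2240, 330.0))    # latency made up
print(log.credits_per_second("hdr"))      # 256.0
print(log.credits_per_second("hdr+exr"))  # 448.0
```

A per-tier credits-per-second number makes the HDR vs HDR+EXR gradient visible in your own data rather than in launch-day examples.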
What I want to see next
- Independent motion and temporal tests. Quantify scene coherence across 5 to 10 seconds, with object persistence and occlusion handling.
- HDR fidelity under grade. Stress test highlight detail retention, skin tone stability, and banding in 16-bit EXR exports.
- Latency and throughput. Real numbers for Draft and Hi-Fi in Fast vs Relaxed, and how that scales with queue depth.
- Codec and frame rate sheets. Simple, public matrices for container, codec, bit depth, and frame rates at each quality tier.
Bottom line
Ray3 brings two things that matter for serious work: a credible shot at better planning and motion, and native 16-bit HDR with EXR output for finishing. The packaging is simple for two weeks: Firefly users on paid plans can run unlimited Ray3 through Oct 1, and Dream Machine also has Ray3 during the same window, with credits for everyone else. The costs on Dream Machine add up fast at HDR and HDR+EXR, so Draft first and promote only the finalists that deserve it. The missing pieces are benchmarks and spec sheets. Until those arrive, the strongest signal is still what we can see in the posted clips.

