Star Citizen Wiki


A head-up display or heads-up display (also known as a HUD) is any transparent display that presents data without requiring users to look away from their usual viewpoints. The origin of the name stems from a pilot being able to view information with the head positioned "up" and looking forward, instead of angled down looking at lower instruments.[1]

Developer’s comments

Star Citizen will feature a powerful, flexible, and immersive HUD. Here are some tidbits from the HUD developer himself:

It is radically different from the standard game HUD formula that’s prevailed for the last decade. Taking inspiration from previous Chris Roberts ship controls, nothing in our work is similar to the templates you’ve seen in more recent games; it’s been engineered from the ground up to deliver an unparalleled degree of control and detail to the pilot. Chris is an inspiring guy to work for, and I’m honored to have been picked for the job. We aren’t cutting any corners. All the elements of the HUD systems we’ve developed are based on procedural algorithms and can be customized to different ships and styles to a huge degree of depth and flexibility, keeping the scope of the universe in mind. My task is to pack as much useful information as possible into tight, clear displays, to give pilots those crucial visual cues that can mean the difference between life and death in a dogfight. The HUD will be your friend, and learning to read it reflexively will shave crucial milliseconds off your reaction time. Sometimes it feels more like I’m developing an avionics package for the military than for a game![2]

–Josh Strike

Star Citizen - Immersion (video)

You can catch a tiny bit of it in the “Immersion” video which shows some of the earliest integrations of marks, radar and ship displays when you climb into the cockpit. Although this is just the tip of the iceberg, you’ll notice the screen displays there look quite a bit different from anything in the other videos. What you’re seeing (for HUD-skeptics) are functional components with a lot of data layers and controls built in, but they aren’t engaged yet in that video.

Mea culpa, by the way. I was trying to do something that hadn’t been done before, which was part of the reason we missed our integration target, which is most of the reason why the targeting reticles and gun pips weren’t in for the live demo (which is why Chris missed the enemy ship — and why some people seem to doubt the HUD exists). But I promise the end result is well worth it; you’ve never seen this stuff in a HUD before, because we had to rewrite parts of the CryEngine/Scaleform integration code and roll a whole bunch of new graphics methods just to make it work.[3]

–Josh Strike


3D – yes. Looks very, very cool in stereoscopic. [...] I should add that I’m not authorized to answer all questions, but yes, a great degree of configurability is built into the HUD from the ground floor up. And the design ethos is to provide maximum information in a compact space during battle. This includes movable and configurable displays, of course. I’m on the graphics side, so questions related to integration (and hardware compatibility) should generally be directed to my superiors.[4]

–Josh Strike


I’m seeing a lot of questions about 2D vs. 3D displays, and about 3D HMDs vs. cockpit-projected HUDs. In fact, Chris made the decision to use all three types in Star Citizen from the beginning.

There’s a fixed holographic projection within the cockpit, which overlays things that don’t move with your head, e.g. velocity, acceleration, attitude, altitude, heading and targeting data. This projection has depth for greater effect (along the lines of what you see in that Scaleform promotional video), and also uses some (non-essential) stereoscopic depth cues to assist the pilot in reading certain situations more quickly. You don’t need an Oculus to get all the same info, but I’m not saying the depth cues won’t give you a slight edge.

In addition to that, there’s a separate projection on the inside of the pilot’s helmet which can be loaded up with deeper data sets (e.g. ship status, weapons selection, power balancing, navigation maps, communications, etc.). This HMD projection stays in your field of vision when you turn your head.

Finally, elements from both of these projections can be shunted to the flat LCD displays or brought back up to their respective projection, and have been designed to shift shape, color and opacity, and/or break into separate elements, depending on whether they’re being displayed on a flat screen or holographically. So yeah, there was definitely some inspiration from Minority Report and Iron Man, but at the same time these elements are very much in the CR space sim style, and my primary goal is to remain true to his original vision.

I should stress that these projection layers exist, and about 80% of the gizmos are fully functional; in tests, they have been added to the projections and screens and run successfully with dummy data. The part that’s not yet complete is the full integration that sends active environmental data to the components. The HUD is a platform in itself with 62 custom classes and a 20-page API manual so far. There are hundreds of data points that need to be connected up to the ship’s systems, so we’re still in the process of getting the cockpit fully “wired”.

And no, we’re not deviating from the classic polar-mapped radar screen! But we’ve added some nice touches like sector heat mapping, and ship-on-your-tail alerts.
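The classic polar-mapped radar mentioned above boils down to a simple coordinate transform: each contact's position relative to the ship is converted into a range and a bearing, which the screen then plots as distance from center and angle from the nose. The sketch below is purely illustrative (the function name and axis convention are assumptions, not CIG code):

```python
import math

def to_polar(dx: float, dy: float) -> tuple[float, float]:
    """Convert a contact's ship-relative position (x = right, y = forward)
    into polar radar coordinates: (range, bearing in degrees clockwise
    from the ship's nose, 0-360)."""
    rng = math.hypot(dx, dy)                      # straight-line distance
    bearing = math.degrees(math.atan2(dx, dy)) % 360.0
    return rng, bearing

# A contact 500 m dead ahead plots at bearing 0; one directly to
# starboard plots at bearing 90.
```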

Additionally, someone asked me about incorrect and/or damaged displays. Every element in our HUD responds to damage. As Chris built the fly-by-wire system to procedurally handle an infinite range of ship states based on damage to various components, that philosophy was extended to have damage also rendered procedurally in the HUD. Just as one example, text in the HUD is not pre-rendered or even generated on the fly as a block; it’s printed procedurally to the displays one character at a time, with a greater likelihood of transcription error (or garbled transmission) depending on specific damage to your avionics or communications systems. Response to damage is a bedrock feature of every element in the HUD, and damage degrades boot times, data latency and accuracy accordingly. If this sounds like something not recommended in a Scaleform project, it isn’t recommended — and they said it was crazy — but we’ve refactored, honed and optimized it to work.[4]

–Josh Strike
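The per-character transcription errors described in the quote above can be sketched in a few lines: each character printed to the display has a chance of being garbled, with the probability scaled by the damage level of the relevant system. This is an illustrative toy model only; the function, glyph set, and damage scale are assumptions, not CIG's implementation:

```python
import random

GARBLE_GLYPHS = "!@#$%&*?"  # stand-in glyphs for transcription errors

def print_procedural(text: str, damage: float, rng: random.Random) -> str:
    """Emit HUD text one character at a time. Each character is
    independently mistranscribed with probability `damage` (0.0 = pristine
    avionics, 1.0 = fully garbled)."""
    out = []
    for ch in text:
        if rng.random() < damage:
            out.append(rng.choice(GARBLE_GLYPHS))  # garbled character
        else:
            out.append(ch)                         # clean character
    return "".join(out)
```

With `damage=0.0` the text renders exactly; as damage rises, more characters come out scrambled, so readouts degrade gracefully rather than simply switching off.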

References

  1. Head-up display, article on the English Wikipedia
  2. robertsspaceindustries.com "Hello (from one of the devs), post #25804". (Note: Page is no longer accessible on the RSI website)
  3. robertsspaceindustries.com "Hello (from one of the devs), post #25906". (Note: Page is no longer accessible on the RSI website)
  4. robertsspaceindustries.com "Forum thread “josh strike?”, post #78948". (Note: Page is no longer accessible on the RSI website)