  • How to get a simple live view with ASI Studio

Hi folks, I'm very unfamiliar with astrophotography and electronics, so please be gentle.
I have a new ZWO ASI 120 MC-S and am running ASIStudio on my MacBook Pro under macOS Catalina 10.15.7.
I'm an astronomy outreach professional and NOT interested in astrophotography per se, but rather want to use the ASI camera to capture live sky views that I will project onto a cinema screen for an audience who will be listening to me talk about the targets as I remotely move my scope across the sky.
My main problem is understanding the basics of how to simply get an image to appear in the software! I don't even know whether I should use Planetary Imaging, Deep Sky Imaging, Live Stacking or Deep Sky Stacking.
Can someone please walk me through the process?
Thank you


    Scubayorp: "I don't even know if I should use Planetary Imaging, Deep Sky Imaging, Live Stacking or Deep Sky Stacking?"

    Planetary Imaging -- very bright objects that require short (10-100 millisecond) exposures and high gain (high gain reduces the read noise relative to the signal). Planets, the Moon, and the Sun with solar filters or etalons. You then stack these images with a planetary stacking program such as AutoStakkert to achieve very high resolution from the many short exposures, each of which freezes the "seeing" (what astronomers call "lucky imaging"). For your outreach, you can use this mode for real-time Moon and Sun without stacking and get passable images, since you don't need as much resolution to impress the unwashed. You can make out some blurry bands on Jupiter and a blobby ring around Saturn (forget about the Cassini Division) without stacking.
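    For the curious, the lucky-imaging idea is easy to sketch: score every frame for sharpness, keep only the best fraction, and average those. A toy illustration in Python/NumPy follows -- the Laplacian-variance metric and the 25% keep fraction are my own illustrative choices, not what AutoStakkert actually uses:

```python
import numpy as np

def sharpness(frame):
    """Variance of a simple 4-neighbour Laplacian; sharper frames score higher."""
    lap = (-4.0 * frame
           + np.roll(frame, 1, axis=0) + np.roll(frame, -1, axis=0)
           + np.roll(frame, 1, axis=1) + np.roll(frame, -1, axis=1))
    return lap.var()

def lucky_stack(frames, keep=0.25):
    """Keep the sharpest fraction of frames and average them."""
    ranked = sorted(frames, key=sharpness, reverse=True)
    best = ranked[:max(1, int(len(ranked) * keep))]
    return np.mean(best, axis=0)

# Synthetic demo: a checkerboard stands in for a "sharp" frame,
# a uniform grey frame stands in for a seeing-blurred one.
sharp = (np.indices((8, 8)).sum(axis=0) % 2) * 100.0
blurred = np.full((8, 8), 50.0)
stacked = lucky_stack([sharp, blurred, blurred, blurred], keep=0.25)
```

    With four frames and keep=0.25, only the single sharpest frame survives the cut, which is the whole point of lucky imaging: discard the seeing-smeared majority.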

    Deep Sky Imaging -- dim objects (or not-so-dim ones, like M31). Long exposures, plus low gain to maximize dynamic range (for targets like M42). For most suburban skies (Bortle 5 to 7), exposures are typically 1 to 3 minutes before the shot noise from the sky far exceeds the camera's dark current and read noise. Often you have to mix very short exposures with long exposures to get enough dynamic range (for example, seeing the M42 nebulosity without saturating the Trapezium region). If you need even longer exposures (and you will), you simply save each of these minutes-long exposures and stack them into hours-long equivalents with dedicated software. For your purpose (instant gratification), this mode only works for brighter objects without stacking, but you can still get visible images of things like the North America Nebula and a hint of the Cygnus Loop.
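    The short-plus-long exposure mixing can also be sketched: wherever the long exposure has clipped at full well, substitute the short exposure scaled up by the exposure ratio. A minimal illustration with made-up numbers (real HDR merging blends the transition far more gracefully than this hard swap):

```python
import numpy as np

def hdr_merge(short_exp, long_exp, exposure_ratio, full_well=65535):
    """Where the long exposure clips (e.g. the Trapezium core),
    substitute the short exposure scaled by the exposure ratio."""
    merged = long_exp.astype(np.float64)
    clipped = long_exp >= full_well
    merged[clipped] = short_exp.astype(np.float64)[clipped] * exposure_ratio
    return merged

# Toy 1-D "image": the middle pixel saturates in the 60 s exposure.
long_exp = np.array([1000.0, 65535.0, 2000.0])   # 60 s frame, ADU
short_exp = np.array([16.0, 1500.0, 33.0])       # 1 s frame, ADU
merged = hdr_merge(short_exp, long_exp, exposure_ratio=60)
# merged[1] == 1500 * 60 == 90000, recovering detail beyond full well
```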

    Live Stacking -- as it says, "live": instant gratification. The above process with the stacking done in real time, so you see progressively better images as more exposures are collected. The stacking is very inferior to real stacking programs and more prone to tracking and field-rotation errors over the course of an hour. Star colors are also inferior to what something like Astro Pixel Processor's star color tools can give you. That said, it probably suits what you want to do best, though it is terrible if you are serious about your images. The unwashed you are outreaching to won't know the difference between a good image and a poor one -- they just go "ooooh, Orion Nebula -- look, shiny!"
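    Under the hood, live stacking is essentially a running mean: each new frame nudges the accumulated image toward the true signal, and the random noise falls roughly as 1/sqrt(N). A toy sketch with synthetic frames (all the numbers are invented for illustration; ASI Studio's actual alignment and rejection logic is not public):

```python
import numpy as np

def live_stack(frames):
    """Yield the running mean after each incoming frame.
    Noise in the stack falls roughly as 1/sqrt(N)."""
    stack = None
    for n, frame in enumerate(frames, start=1):
        f = frame.astype(np.float64)
        stack = f if stack is None else stack + (f - stack) / n
        yield stack  # progressively cleaner image, "live"

rng = np.random.default_rng(42)
truth = np.full((16, 16), 100.0)                     # the "real" sky signal
frames = [truth + rng.normal(0.0, 10.0, truth.shape)  # noisy exposures
          for _ in range(49)]
for result in live_stack(frames):
    pass
# Single-frame noise sigma is 10 ADU; after 49 frames it is near 10/7.
```

    This is why the on-screen image visibly "cleans up" as the minutes pass -- exactly the instant gratification an outreach audience responds to.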

    Deep Sky Stacking -- a program that stacks (offline) the images you got earlier from Deep Sky Imaging. If you are even halfway serious about astrophotography, do yourself a favor and buy a real program like Astro Pixel Processor or PixInsight instead. The Deep Sky Stacker in ASI Studio is rudimentary at best -- you get what you pay for.

    All that said, the ASI120-class camera will give you extremely disappointing images: too little dynamic range, too much noise, shallow wells, a sensor covering a very small FOV, etc.

    Chen

      w7ay
      Hmmmm, thank you so much, Chen. I'm staring at a steep learning curve and am now not sure I haven't bought the wrong thing. The problem is cost -- I'm a low-income earner on a pension.
      As you say, the hoi polloi are easily satisfied but that is exactly my main audience! I love it when they think I'm showing and telling them something amazing. 😉 Takin' it to the ppl. 🙂
      However, I'm really worried that you say my images will be extremely disappointing.
      My scope is a 14" alt-az Go-To Dob that I can slew remotely with my iPhone while I wander amongst the 200-300 seated masses, some metres away.
      After initial alignment at the scope, I'm hoping to use SynScan Pro on my iPhone to remotely go to a target and then centre that target from my iPhone as I waffle on about what we're all seeing on the cinema screen. Targets are typically sparkly things like the Moon, planets, Messiers, globular clusters, multiple stars etc.
      I'll also be carrying my iPad tethered wirelessly to my MacBook Pro which is at the scope taking the camera feed by cable. I want to use something like TeamViewer to control the laptop screen from the iPad.
      Via HDMI cable, the laptop will output the ASIStudio image to the cinema's system and onto the big screen.
      Then I want to be able to remotely Go-To my second target and do the same all over again.
      Seems complicated but I can't think of an alternative.
