See the previous post for details on the book. For some reason the originally scheduled sale on Wednesday didn’t fire.
When an amateur astronomer detects a modulated signal in the flash of a nova burst, the race is on to capture and decode the signal before it’s too late.
This science fiction short story is my first self-published eBook through Amazon, although it’s not my first sci-fi short story. Of course I’ve written plenty of technical magazine articles and papers, not to mention untold numbers of online posts, articles, how-tos, etc., but fiction is something I never felt I could dedicate the time to, much less carry a story to completion. Thus, the most I’ve been willing to shoot for is a couple of short stories.
Leaving out some stories I wrote back in high school and college that I’d love to dust off if I could find them, my first relatively contemporary short story was “Goodbye Mr. Smith”, a sad and extremely short story from a sad time in my life that surprisingly has components in common with the end of Dyson’s Radio. In “A Dark Matter” (currently awaiting a response from Analog) I discovered a narrative approach that worked well for me. Thus, when I came up with the idea for Dyson’s Radio, I knew exactly what I needed to say and where I needed to end up. However, even I didn’t know everything I’d discover along the way. The story ended up being quite a bit longer than I expected. It also became so current and topical that I decided it couldn’t wait for a long editorial review process. After a quick response from Asimov’s saying they wouldn’t be able to do it in time, I decided to go for direct publication to be sure that people had a chance to read it as it was intended to be experienced.
While I could easily have padded the story considerably by explaining many of the things mentioned, it was really written for the true space geeks among us who will immediately understand most of the references that the characters already know. Thus, if you’re here it’s probably because you already love the idea of space and space exploration and don’t need anyone talking down to you about it. However, there is a tremendous amount of background information I put together to keep the story plausible, and I envision writing a technical article on “Building Dyson’s Radio” to be published sometime in the future. And who knows. If I get brave enough I may even revisit this world. There’s certainly plenty to build on, but I just don’t know if I’m up to the challenge!
And if you’re lucky enough to be reading this today (December 6th, 2018), it’s currently free on Amazon. Merry Christmas!
I started writing this post just over a year ago and then realized I wasn’t done, so I didn’t get much further than a title and a first line that I tossed. However, at this point I think I’m about as done as I’m going to get with processing of images related to last year’s solar eclipse. I’d originally intended to design another poster or two, but at the moment I have too many other things going on and don’t feel all that creative in that direction. Perhaps before the next total eclipse rolls around in 2024 I’ll go back and look at it again.
At any rate, what I have accomplished just recently was to finally go back and finish a process I’d started last year when I discovered I’d captured the amazing lunar umbra shadow on the clouds in the following image. Since I apparently never made a blog post on THAT I’ll post it here too. In fact, it turns out that I never posted much of anything here about all the processed images from the eclipse, but I did make a few posts on Cloudy Nights.
So you can see all my images and time lapse video from the event, along with details on the preparations and photography of the actual eclipse. The one thing that took me quite a while to do (and even longer to go back and finish) was manually recording the automatic exposure settings of every frame of my time lapse video, then writing a script to back those adjustments out of each frame in Photoshop so that it’s as though the camera was set with a fixed exposure during the entire eclipse. While no computer monitor or file format can cover the real dynamic range from full sunlight to evening night sky, this video will give you an idea of the change in illumination throughout the duration of the eclipse. The bright sky is completely blown out, but if you watch the landscape and foreground you can see things getting darker during the progression of the partial eclipse before things go almost completely black at totality. The goal is to give you an idea of what totality really felt like. Note you can still see the shadow of the Moon against the sunlit clouds along the horizon. There are two versions of this in the solar eclipse party folder linked above, but this one has the actual eclipse progression overlaid in the corner once the image capturing starts at about 30 seconds into the video. You can compare the darkening of the sky to the progress of the Moon across the Sun. I hope you enjoy it!
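For anyone curious about the exposure normalization idea, here’s a minimal sketch of the math. My actual processing was a Photoshop script driven by the manually recorded settings; this Python equivalent is purely illustrative, and the function names and the example shutter/ISO values are my own assumptions, not the real recorded data.

```python
# Illustrative sketch: back auto-exposure out of time-lapse frames by
# computing how many stops brighter each frame was exposed relative to a
# fixed reference exposure, then dividing the linear pixel values by the
# corresponding factor of two per stop.
import math

def ev_offset(shutter_s, iso, ref_shutter_s, ref_iso):
    """Exposure difference (in stops) between a frame and the reference."""
    return math.log2((shutter_s * iso) / (ref_shutter_s * ref_iso))

def normalize_pixel(value, stops):
    """Scale a linear-light pixel value to undo 'stops' of auto-exposure."""
    return value / (2.0 ** stops)

# Example: a frame auto-exposed at 1/30 s, ISO 400, against a reference
# of 1/500 s, ISO 100, came out roughly six stops brighter.
stops = ev_offset(1 / 30, 400, 1 / 500, 100)
```

Applying that per-frame correction is what makes the landscape visibly darken through the partial phases instead of the camera compensating it away.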
In my ongoing quest to fully automate my observatory, I’ve been developing a number of ASCOM standard drivers for most of my custom equipment. The one thing I’ve wanted to do for some time is to find an alternative to Celestron’s NexRemote software, which has to be manually started and brought out of hibernation, then manually put into hibernation and shut down before powering off the telescope. Even then, it was common for the NexRemote software to hang up and lose alignment, often requiring a good half an hour or more to attempt to re-do an alignment remotely. It never made sense to me that, with the availability of plate solving to determine exactly where the telescope OTA was pointing, I couldn’t just enter the current coordinates and have the scope aligned, given a good polar alignment of the mount.
What I needed was a driver that would automatically restore the motor position information on startup and automatically stop the drive and save the position information on shutdown, not to mention having the ability to take a picture and quickly align the mount to the current RA/DEC coordinates. Given that there was no other alternative forthcoming, I finally decided to go ahead and develop my own solution. Rather than writing another stand-alone program to take the place of NexRemote, which is just an emulator that runs standard hand control firmware, it made more sense to build all the required functionality from the hand controller into a single telescope interface driver that could be used by any software built to use the ASCOM standard. Thus was born the NexStarEquatorial ASCOM driver.
Essentially I’ve implemented the important components of a NexStar hand controller directly into an ASCOM driver that communicates with the motor controllers directly through the PC or AUX port of the mount. Alternatively, the driver also supports communication through the serial port adapter in the base of the NexStar hand controller, although like NexRemote, it does not actually use the hand controller for anything other than the interface. The driver automatically saves and restores the current position of the mount on exit so that even if the mount is powered down in between sessions, when the driver starts back up, everything’s still where it was.
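The save/restore behavior is conceptually simple. This little sketch is not the driver’s actual storage code (the driver is an ASCOM/.NET component, and the file name and format here are made up for illustration), but it captures the idea: persist the raw motor axis positions on disconnect and reload them on the next connect.

```python
# Hedged sketch of the position persistence idea: the mount "remembers"
# where it is across power cycles because the driver writes the axis
# positions out on shutdown and reads them back in on startup.
import json
import os

STATE_FILE = "mount_position.json"  # illustrative name, not the driver's

def save_position(ra_axis_steps, dec_axis_steps):
    """Write the current motor axis positions to disk on shutdown."""
    with open(STATE_FILE, "w") as f:
        json.dump({"ra": ra_axis_steps, "dec": dec_axis_steps}, f)

def restore_position():
    """Read the saved axis positions back on startup, or None on first run."""
    if not os.path.exists(STATE_FILE):
        return None  # first run: a homing procedure is required instead
    with open(STATE_FILE) as f:
        state = json.load(f)
    return state["ra"], state["dec"]
```

As long as the mount isn’t physically moved while powered down, reloading those counts is all it takes for everything to still be where it was.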
This driver is targeted towards full system automation and imaging, so it only supports the NexStar mounts in equatorial mode (thus the name). It also relies heavily on imaging software capable of plate solving and aligning automatically (i.e. something like Sequence Generator Pro). With a good polar alignment and starting point, there’s no need to do any sort of manual alignment at all, but it does support manual entry of coordinates from a plate solve to get the current reference if desired. I actually don’t use it because SGP handles that all for me.
By limiting the functionality to equatorial mounted scopes, including German Equatorial Mounts (GEMs) like the CGE, CGEM, CGE Pro, etc., and fork mounted telescopes like the NexStar GPS and CPC mounted on polar wedges, the requirements for the driver are greatly simplified. For a mount that’s perfectly polar aligned, the “azimuth” axis becomes the right ascension (RA) axis and simply needs to run as a clock motor, with one revolution every 24 hours. The other axis (altitude/elevation for a fork mount) just provides the North/South declination adjustment. The spherical coordinate system for an equatorial mounted scope is pretty straightforward then, with the current local sidereal time indicating what should be directly overhead along the meridian line dividing the eastern and western halves of the sky. The azimuth angle relative to the meridian just needs to be converted from 360 degrees to a 24 hour format and offset by that sidereal time to get the current right ascension, although for GEM mounts that also requires an east/west offset of 90 degrees between the axis position and the orientation of the OTA. The declination axis runs ±90° from the equator, but again for a GEM there’s a positive and negative side to the OTA orientation depending on which side of the pier the OTA is on and which side of the sky it points to. And of course in the southern hemisphere, the scope is mounted in the opposite direction and all axes have to run backwards. In all it’s just a pretty basic spherical geometry problem, but getting all the signs and combinations of behavior correct was still a bit tricky. Some of the Celestron mounts have their own quirks as well, with the CGE Pro (and possibly others) requiring a 90 degree offset to the definition of “up” on the azimuth axis. The ASCOM standard brings quirks of its own.
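The core conversion described above can be sketched in a few lines. This is deliberately the simplest case only: a perfectly polar-aligned fork mount in the northern hemisphere, with the GEM pier-side offsets, southern-hemisphere sign flips, and per-mount quirks all omitted, and with a sign convention I’ve assumed for illustration.

```python
# Simplified sketch of axis angles -> RA/DEC for a polar-aligned fork
# mount (northern hemisphere). 360 degrees of the RA axis corresponds to
# 24 hours, i.e. 15 degrees per hour.
def axis_to_ra_dec(azimuth_deg, altitude_deg, lst_hours):
    """Convert mount axis angles to RA/DEC.

    azimuth_deg:  RA-axis angle measured from the meridian (degrees,
                  positive toward the west -- an assumed convention)
    altitude_deg: declination-axis angle from the celestial equator
    lst_hours:    current local sidereal time (hours)
    """
    hour_angle = azimuth_deg / 15.0        # convert degrees to hours
    ra = (lst_hours - hour_angle) % 24.0   # offset by sidereal time
    dec = altitude_deg                     # runs +/-90 deg from the equator
    return ra, dec

# A scope sitting on the meridian points at RA equal to the current LST:
# axis_to_ra_dec(0.0, 0.0, 14.5) gives (14.5, 0.0).
```

Adding the GEM 90 degree OTA offset and the pier-side sign handling on top of this is where the combinations get tricky.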
The following figure is the current setup dialog for configuring the driver. It sets up the communication method and mount type, as well as the mount location which is critical to the initial determination of the RA/DEC coordinate system. Subsequent homing and alignment/synchronization steps will set the actual position offsets once connected to the scope. In addition, ASCOM reports the details on the optical tube assembly, so that information can be entered here as well. Hovering over each control provides hints as to the function of each setting.
Once the ASCOM compliant client software connects to the scope through the driver, the following hand control paddle is displayed. The hand controller can be toggled to stay on top, or closed and reopened from the ORO logo icon in the system tray. The hand control provides traditional manual positioning control, targeting, position readout, tracking and parking control, as well as basic alignment functions. No positioning database is provided, as that is expected to come from the client software. Other than an initial homing process to set the reference position of the mount the first time the driver is used, no further manual interaction should be required given suitable client software.
The driver has been made available for beta testing by interested users, and I’ve started this thread on Cloudy Nights for comments and feedback. I may eventually publish a more detailed manual here in the future, and of course expect to probably add a few more features over time as I have time and the desire to do so. I hope others will find this as useful as I have so far. It solves my automation problem amazingly well!
As we progressed through another galaxy season, I was constantly fighting to find a guide star that would keep my target framed. With my remote controlled off-axis guider (OAG) setup, if I can’t find a good star in the guide camera field of view (FOV) when the target is centered, my only option is to shift the entire image position looking for a star, since I can’t rotate or otherwise move the guide camera relative to the target FOV. Thus I started thinking about upgrading from my QHY5L-II M to a new guide camera, ideally with both a larger FOV and higher sensitivity. Looking at the new crop of guide cameras, this led me to cameras based on the Sony Starvis (star visibility) IMX178 sensor. Compared with the MT9M034 sensor in the QHY5L-II, the IMX178 sensor has just over twice the total area (7.4 x 5 mm vs. 4.8 x 3.6 mm, or 2.09x), but with 5x the pixel count. In addition to improving my guiding, another advantage to this 6.3 MP camera is that it would be a reasonable entry-level imager for narrow band imaging (e.g. H-alpha solar imaging, etc.).
After some online discussions with others on Cloudy Nights and elsewhere, but otherwise very little feedback on the best choice for my application, I ended up putting together a comparison table between the monochrome sensors currently being targeted for guide cameras. The following table is in order of increasing price. The IMX290 sensor has the best sensitivity in the new Starvis line, but with the 1080P 16×9 form factor, it suffers the same limitations I discovered when I tried using the ZWO ASI290MC for my all sky camera. It basically has the same area as the QHY5L-II, but with a less practical form factor (16×9 vs 4×3), so while I’d expect the sensitivity to help considerably, it wouldn’t increase the usable FOV. On the other hand, the IMX174 is a much bigger sensor, which presumably explains why cameras based on it are considerably more expensive, but it’s also based on older technology and has a much lower resolution. By simple virtue of the large pixel size, it can be expected to be more sensitive than the QHY5L-II, but the potential loss in guider resolution, not to mention the much higher cost, make this an impractical solution.
| Sensor | MT9M034 (QHY5L-II) | IMX290 | IMX178 | IMX174 |
|---|---|---|---|---|
| Size (mm) | 4.8 x 3.6 | 5.61 x 3.18 | 7.37 x 4.92 | 11.34 x 7.13 |
| Pixel Size (μm) | 3.75 | 2.9 | 2.4 | 5.86 |
| Relative Pixel Area | 1x | 0.60x | 0.41x | 2.44x |
| Pixel Count (W x H) | 1280 x 960 | 1936 x 1096 | 3072 x 2048 | 1936 x 1216 |
| Total Pixels (MP) | 1.2 | 2.1 | 6.3 | 2.4 |
| Relative Pixel Count | 1x | 1.7x | 5.1x | 1.9x |
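For anyone wanting to check or extend the table, the “relative” rows fall straight out of the published specs: pixel area scales as the square of the pixel pitch, and both rows are normalized to the MT9M034 in the QHY5L-II. This sketch just reproduces that arithmetic.

```python
# Recompute the relative-area and relative-count rows of the sensor
# comparison table from the per-sensor pixel pitch and pixel counts.
pixel_size_um = {"MT9M034": 3.75, "IMX290": 2.9, "IMX178": 2.4, "IMX174": 5.86}
pixel_count = {"MT9M034": (1280, 960), "IMX290": (1936, 1096),
               "IMX178": (3072, 2048), "IMX174": (1936, 1216)}

base_area = pixel_size_um["MT9M034"] ** 2   # reference pixel area
base_count = 1280 * 960                     # reference total pixels

rel_area = {s: round(p * p / base_area, 2) for s, p in pixel_size_um.items()}
rel_count = {s: round(w * h / base_count, 1)
             for s, (w, h) in pixel_count.items()}
```

Running this reproduces the table values (e.g. 0.41x area and 5.1x pixel count for the IMX178).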
That really just leaves the original IMX178 I started looking at, which also happens to be the color sensor in the ZWO ASI178MC I finally integrated into my all sky camera. That brought me back to the original decision I was trying to make: which camera vendor to use, QHY or ZWO. My imaging camera and previous guider are QHY, but I tried ZWO for the all sky camera, partly due to a better overall price point. However, the other factor here was the camera form factor. While both QHY and ZWO make cameras in the mini guide camera form factor, for this particular sensor only QHY made a mini similar to the QHY5L-II I was replacing. For ZWO I’d have had to go with the same form factor as the all sky camera. While that’s essentially the same form factor as the StarShoot AG I’d been using before the QHY5L-II, I had reason to want to stay with the mini form factor, for weight if nothing else. Thus, even though it was about 10% more expensive, I chose to go with the QHY5LIII178M over the ZWO ASI178MM. Only after receiving my new camera did I finally get feedback from someone that the QHY version has better fixed pattern noise due to some additional circuitry QHY adds! Hopefully that justifies the added cost.
I ended up ordering from High Point Scientific simply because they had it in stock and OPT didn’t. Both were running the same 5% NEAF discount on top of QHY’s NEAF sale. High Point shipped it after one business day by USPS and it arrived two days later. High Point was also asking customers to create unboxing videos, so this is the first time I’ve attempted to capture video footage in addition to photos of the process.
The camera arrived nicely packaged in a “collector can” similar to that for the QHY5L-II.
The camera is a nice pretty blue anodized package.
The connector end of the camera has a USB 3.0-B style connector and a non-standard LEMO connector for the guide port due to the limited space. Given the compromise ZWO was making for this form factor by going to a micro USB 2.0 interface, I’d much rather have a USB-B connector than a USB 3.0 micro connector and a standard RJ-11 socket for the guide port.
I didn’t get the best of focus, but here’s my attempt to compare the size of the two sensors. You can see how much bigger the new IMX178 sensor is. The distance from the tube front appears the same, so adjusting the confocal ring on the new camera to the same distance from the front of the nosepiece had it almost perfectly focused when I put it into the OAG.
Here it is installed on the OAG. I forgot to take the picture while I was using it, so this was after parking. I definitely need to work on cleaning up my cables!
Before swapping cameras, I did a few tests to try to get an idea of the relative sensitivity of the two cameras. With the exception of the IMX290 and IMX178, which are in the same Starvis series, there was precious little comparable sensitivity data for the sensors, so this was the first chance to find out if I really gained anything in sensor sensitivity. I’d previously been imaging M81 and knew I had one good guide star visible there with the QHY5L-II. At one second exposure, that single star is visible in the FOV.
At five seconds, a second star is visible if I adjust the gamma to pull it out of the background.
That’s really the limit of usable guide star exposure length, but doubling it to ten seconds didn’t change anything. Swapping to the QHY5LIII178M, that first star is clearly visible at 0.2 second exposure. The star profile is a bit noisy, but part of that is because the camera’s not perfectly focused at this point. You can see how tiny the star is given PHD2 shows the entire FOV, so the scale here is smaller due to the much higher resolution.
At one second, the second star becomes visible. This image also gives you an idea of how much larger the overall FOV has become.
At two seconds a third star becomes visible, although it’s easier to see in this four second exposure.
At this point there’s some background noise becoming visible and the image became unusable at a five second exposure. In hindsight I believe this was due to having left the observatory monitoring cameras enabled so that their IR lights were creating a strong background lighting. I may add some new test images to the gallery later, but the general observation is that despite the smaller pixel size, the QHY5LIII178M is at least five times more sensitive than the QHY5L-II. From Sony’s documentation, the IMX290 is twice as sensitive as the IMX178, but at that point chances are good I would be sky glow limited in some cases!
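The “at least five times more sensitive” estimate is just the ratio of exposure times needed to show the same star, and correcting for the smaller pixel area shows how much of that comes from the sensor technology itself. A quick back-of-the-envelope, using the figures from the tests above:

```python
# Rough sensitivity comparison from the guide-star tests: the same star
# needed ~1 s on the QHY5L-II but only ~0.2 s on the QHY5LIII178M.
exposure_ratio = 1.0 / 0.2             # ~5x less exposure for the same star

# The IMX178 pixel is smaller, so per unit of sensor area the
# improvement is even larger (pixel area scales with pitch squared).
pixel_area_ratio = (2.4 / 3.75) ** 2   # ~0.41x the MT9M034 pixel area
per_area_gain = exposure_ratio / pixel_area_ratio
```

That works out to roughly a 12x improvement per unit area, which is consistent with the Starvis line being a genuine generational step rather than just a geometry change.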
Today was a sad day for the Texas Museum of Science and Technology as the temporary location in Cedar Park closed its doors. It would have been a great day had that been due to moving into a permanent facility, but given the absurdly high cost of the lease on the old indoor soccer building (which had just a dirt floor until six months or so ago), it was costing about $50k a month in donations just to keep the doors open. Due to that and lingering financial consequences of some of the initial exhibits, like the incredibly expensive Body Worlds show they opened with, in January the museum Board of Directors chose not to renew the lease and instead focus their efforts and any incoming donations on their permanent location.
On Friday night we had our final star party under suitably gloomy weather, kicked off by a dinner and live music in the parking lot. I’ve posted some pictures of the event, including a last run through the museum to capture some artwork on the James Webb Space Telescope, and a rare picture of Mercury above Venus just over the corner of the museum building.
As far as the future of TXMOST, the goal is to continue looking for large corporate donors to support the building of a permanent location, and in the meantime possibly reopen on a smaller scale if a suitable location can be found. There have been ongoing negotiations with the City of Cedar Park to provide the land for the permanent location, but without suitable backing to construct the building, the City has not been willing to officially commit the land. So, if you’re here and reading this, I’m sure TXMOST will appreciate anything you wish to donate, but you can also talk to your company’s HR and management to see if your company has any sort of charitable outreach program that could get involved. Many of us work for high tech companies that rely on finding employees with science, technology, engineering, and math backgrounds. Certainly they should find value in helping to spark an interest in STEM in our next generations at an early age. The future will thank you for it!
As far as the Friday night star parties, I am working on lining up a new location to be able to keep doing the star parties and keep up the awareness of TXMOST. Watch here for more information on that.
My original NTSC all-sky camera had degraded to the point where it was useless. The main problem was that the acrylic dome had weathered and aged to a fogged up brown color, but I also decided it was time for a high resolution digital solution that would give me a live online feed. ZWO sells a number of planetary cameras that include a wide angle lens advertised for use as an all-sky camera. Originally I was looking at the $169 ASI120MC, but decided to go for a bit more horsepower with the USB 3.0 ASI290MC. At 1280×960, the ASI120MC is basically twice the ~640×480 of the NTSC camera (4x if you go by the total number of pixels), but the other big difference is the direct digital output vs. the analog video output being digitized by the security DVR. The ASI290MC was on sale for $299 at the time, and being USB 3.0 sounded like a good idea for real-time recording. It uses a 1936×1096 pixel sensor (basically a 1080P camera) that I thought had a wider field of view based on the published specs, but that ended up just being due to the 16:9 format chip being wider horizontally than the 4:3 format, so the actual sky coverage vertically wasn’t very good.
More importantly, the USB 3.0 functionality was very poor, with the ASCOM driver hanging if you tried to take images at less than once a second, and the DirectShow driver crashing constantly. The poor USB 3.0 performance was frustrating since I’d invested in the only USB 3.0 capable stick PC to put in the camera, but moving down to USB 2.0 speeds meant that I could eliminate the computer in the camera housing and just go to a good USB 2.0 extender that worked over the CAT6 cable I have running to the camera location. The DirectShow driver still crashed, but at least I could get something going with ASCOM. However, given the total set of problems, including the poorly represented field of view, I decided to return the ASI290MC. Unfortunately ZWO was still the only solution in this price range, so I ended up moving up to the $369 ASI178MC, whose Y dimension is about the same 5 mm as the X dimension on the ASI290MC. That meant that for a lens that hit the full field of view (FOV) only in the X direction on the ASI290MC, it would produce a full 360 degree FOV on the ASI178MC. And at 3096×2080, the nearly 2x increase in vertical resolution was well worth the difference in price. The camera still had the same driver problems as the ASI290MC, but at least it gave a better FOV with the default lens that came with it.
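The sensor-geometry reasoning here boils down to one constraint: an all-sky fisheye lens projects a circular image, so the full 360 degree view only fits if the image circle fits inside the sensor’s *shorter* dimension. A tiny sketch of that check, where the ~4.9 mm image-circle diameter is an assumption for illustration (the sensor dimensions are the published IMX290/IMX178 figures):

```python
# Does a fisheye lens's circular image fit entirely on a given sensor?
# The limiting dimension is always the shorter side of the chip.
SENSORS_MM = {"ASI290MC": (5.61, 3.18), "ASI178MC": (7.37, 4.92)}  # W, H

def fits_full_circle(sensor_w_mm, sensor_h_mm, circle_diam_mm):
    """True if the lens image circle fits without cropping."""
    return circle_diam_mm <= min(sensor_w_mm, sensor_h_mm)

# A ~4.9 mm image circle is cropped top and bottom on the 16:9 ASI290MC
# but fits completely on the taller ASI178MC sensor.
```

This is why the ASI290MC’s nominally wider chip didn’t actually buy any vertical sky coverage.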
After some time and several posts on their forum, ZWO did eventually fix the DirectShow driver, opening up the option to use iSpy Connect, an open source security camera monitoring package. I’d need to make a few modifications to it to get some of the features I want, but the DirectShow driver has a built-in auto-exposure and automatic gain control loop that will track the broad change in brightness between day and night operation. At the moment though, I’m concentrating on the ASCOM version of a program called AllSkEye, which, while it also doesn’t do everything exactly as I’d like, is coming along nicely. Being targeted specifically at running an all sky camera, the potential there is good, and the developer Michael Poelzl has been quite responsive in making updates.
Currently the live view of the camera is available under the webcam link on the weather page. At the moment that points straight to the latest image that is updated once a minute. You just need to refresh the page to get a new image. Eventually I intend to embed that in an HTML wrapper though, to be able to provide an image history, so that’s likely to change. You can find the first light images for the ASI178MC in this gallery, including a couple of all night video animations which are pretty incredible (be sure to watch them full screen). I’ve also created a meteors folder and added a few star trail images. Below are a few highlights.
The design, construction, and installation of the two versions of the camera can be found in this gallery. As mentioned, initially I’d intended to use a stick PC and had planned on a fan cooled housing, but when I decided to go with the USB 2.0 extender, I scaled back those plans. Unfortunately the ASI290MC runs very hot to the point where it was enough to deform the original PLA base piece I’d mounted it to. The sealed dome also had a humidity problem causing condensation on the inner surface of the dome that wouldn’t dissipate despite the higher temperature inside the housing.
This took me back to my original concept for the stick PC solution since I was going to need to ventilate it. To minimize the chances of water/humidity and dust ingress, the plan was to pull the air through the conduit, which would pull it from inside the wall. Ideally the air would be cooler and cleaner than the outside air. Below is the 3D CAD model showing the overall design, with a duct and support for the camera that blows the air up through the middle and circulates it around the camera body, then out past the lens into the dome before it exhausts down the sides past the power supply and USB extender and out the ventilated base. I also switched to an O-ring instead of a gasket for the dome to housing transition.
Below are all the new printed parts and components and a few pictures of the build-up and test fit. Follow the links to the gallery page to view the entire step-by-step assembly.
With all the new parts printed and test-fitted, it’s time to do the install. Again, there are lots more pictures in the gallery.
I hope you enjoyed seeing this build. Keep an eye on the galleries for minor adjustments going forward.
Today I’m introducing another artistic astrophotography processing gallery. Through a rather complicated and painstaking process, it’s possible to convert a standard astrophoto into a 3D image. The actual 3D appearance is artificially created, but it still gives an idea of what the object might look like. The stars are removed from the background nebula (in Photoshop, this is a completely manual process using the spot healing tool to avoid rings around the removed star) and then the new star-less background is subtracted from the original to give the star field itself. After noise reduction, sharpening, and other post processing steps, the two parts of the image are split into left and right eye images. Then one of the two images is adjusted, star by star, to move them further apart or closer together. The goal is to bring the brighter stars to the foreground at varying distances and push things like small galaxies into the background. A separate trick is used to change the perspective on the nebula itself, giving a sense of depth in 3D. The result is below.
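For the technically curious, the separation and star-shifting steps can be sketched with plain arrays standing in for Photoshop layers. My actual workflow is entirely manual in Photoshop; the functions below are an illustrative approximation with names and parameters of my own invention.

```python
# Sketch of the layer-separation and stereo-shift steps using NumPy
# arrays (8-bit grayscale for simplicity) in place of Photoshop layers.
import numpy as np

def split_stars(original, starless):
    """Subtract the star-less background from the original image,
    leaving just the star field layer."""
    stars = original.astype(np.int32) - starless.astype(np.int32)
    return np.clip(stars, 0, 255).astype(np.uint8)

def shift_star(image, x, y, size, dx):
    """Crudely move one square star patch horizontally by dx pixels in
    one eye's image; a larger shift makes the star appear closer."""
    patch = image[y:y + size, x:x + size].copy()
    image[y:y + size, x:x + size] = 0            # erase the old position
    image[y:y + size, x + dx:x + dx + size] = patch
    return image
```

Repeating that per-star shift across one of the two eye images is what builds the layered depth effect.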
Try crossing your eyes as you stare at the image above from a couple feet away from your monitor. Eventually you should be able to get it to pop into a 3D view. Note that you’re not really crossing your eyes, but rather trying to get them to look far away but focus on something close. You want your left eye centered on the left image and the right one on the right image but focused close up.
I don’t know how often I’ll go through this process, but I plan to post multiple versions of these images for different viewer approaches. The default is just the two side-by-side images that can be used with the “crossed eyes” focus method, or in a Google Cardboard or other stereo viewer. The second will be with the images squashed to a 2:1 aspect ratio which will allow you to view them on a standard 3D TV (e.g. Samsung) in the side-by-side mode. I’ll also post a red/blue anaglyph version that you can view with retro 3D glasses. Finally, an animated GIF will bounce between the two images giving you a feeling for rotation.
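The red/blue anaglyph version mentioned above is the simplest of these to generate programmatically: a standard red/cyan anaglyph takes the red channel from the left-eye image and the green/blue channels from the right-eye image. A minimal sketch (array shapes are assumptions for illustration):

```python
# Build a red/cyan anaglyph from a left/right stereo pair. Each input is
# an H x W x 3 uint8 RGB array; the left eye supplies the red channel,
# the right eye supplies green and blue.
import numpy as np

def make_anaglyph(left_rgb, right_rgb):
    """Combine a stereo pair into one image for retro 3D glasses."""
    out = right_rgb.copy()          # start with green/blue from the right
    out[..., 0] = left_rgb[..., 0]  # replace red with the left eye's red
    return out
```

Viewed through red/cyan glasses, each eye then sees only its intended half of the pair.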
I hope you enjoy this latest addition to the site.
You may notice that there’s a new folder in the gallery for what I refer to as “Astro Art.” About a year ago I had an inspiration for an image of what it might look like to see one of the deep sky objects (DSOs) that I’ve imaged from the surface of a nearby planet or moon. Drawing from my collection of astro and terrestrial photography, I put together my first Astro Art photo of the Whirlpool galaxy above an airless lunar landscape. I have other concepts in mind, but just haven’t had time to proceed with them. However, I decided it was finally time to share what I’d done. I call this piece “Whirlpool Moon.”
I’m treating this artwork a little differently than all the rest of the images on my website. While there are plenty of sources for astrophotography, many better than mine (NASA’s Hubble images come to mind!), this Astro Art is my own unique creation and each piece is something you won’t find anywhere else. Thus, I am not distributing full resolution files, but rather I’m making various high resolution prints available for you to own. While I haven’t yet decided to limit the number of prints sold (I doubt I’ll ever sell enough to warrant that) I am setting the price at a higher markup to reflect my own effort and the value that I place in these items. I hope you will appreciate these unique works of art and I hope to provide images that will amaze and inspire you.
Dr. Michael D. Foegelle
I’ve debated having an option for direct-printing of the various photos on the website here, and while most all of the images can be downloaded full size and printed by individuals, I’ve found an approach that I think will be best. I’ve sourced a print-on-demand shop that can create posters, mugs, T-shirts, and the like and that will integrate directly into my website shop. Rather than just printing anything, I will be developing customized artwork for specific products using the best of what I have available. I’ve put a couple of items online starting with a new eclipse poster and a couple of mugs. I’ll be adding more items as I have time to develop artwork I’m pleased with, but requests are welcome.
I’m setting this up to pass through the price and shipping costs from the printer, with only a nominal markup for my artwork, depending on the item. I still have some work to do with the integration to deal with tax issues and the like, since California and North Carolina residents must pay tax because the printer operates in those states. I also recommend that if you want to order any of my other offerings along with your POD products that you split them into two orders. That will ensure the fastest delivery since dedicated POD orders are processed automatically, while mixed orders require intervention.
I hope you find something you like!