Blog

A blog about astronomy and software development, mainly focused on my experiences during the development of JPARSEC.

Recent Posts

Rendering the sky like Stellarium with JPARSEC

There is an old project called Stellarium for Java. It is a port of the old 0.8 version of Stellarium, a very popular planetarium thanks to its high-quality, realistic renderings. This project uses the JOGL library to offer a similar experience to the original using Java, but unfortunately it was later abandoned.

I have recently analyzed the original code to see how this port renders the sky, in particular how it computes the atmosphere brightness and the sky colors considering the position of the Sun or the Moon. I have taken just this code and connected it to the rendering process used in JPARSEC, so that similar renderings can be done (pixel by pixel, hence slower) without using the GPU. For this task I have updated JPARSEC with a few improvements (available from the JPARSEC repository at Bitbucket) to the horizon rendering with terrain, adding two more textures.

The idea is to render the sky with JPARSEC using a given texture for the horizon and a transparent background, and to reuse the SkyRendering object to compute, pixel by pixel, the illumination and color of the background atmosphere using the formulae from Stellarium. The final image is then obtained by simply drawing the sky image on top of the atmosphere image. So here are the great results…




 
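In code, the composition reduces to a few lines (a minimal sketch based on the class listed below; its main method does essentially the same):

BufferedImage sky = StellariumAtmosphere.getStarsImage();    // stars and horizon, transparent background
BufferedImage atm = StellariumAtmosphere.getSkyBrightness(); // atmosphere brightness and color
Graphics2D g = atm.createGraphics();
g.drawImage(sky, 0, 0, null); // draw the sky on top of the atmosphere
g.dispose();                  // 'atm' now holds the final image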

I leave to you the task of analyzing the code to understand the underlying maths and physics; I prefer not to describe it since I'm not an expert, having just borrowed the code. Here is the rather long Java class that can be used for such renderings, in case you want to play with it.

StellariumAtmosphere.java
import static java.lang.StrictMath.PI;
import static java.lang.StrictMath.abs;
import static java.lang.StrictMath.acos;
import static java.lang.StrictMath.asin;
import static java.lang.StrictMath.cos;
import static java.lang.StrictMath.exp;
import static java.lang.StrictMath.log;
import static java.lang.StrictMath.log10;
import static java.lang.StrictMath.pow;
import static java.lang.StrictMath.sin;
import static java.lang.StrictMath.tan;
 
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;
 
import jparsec.astronomy.TelescopeElement;
import jparsec.astronomy.CoordinateSystem.COORDINATE_SYSTEM;
import jparsec.ephem.Ephem;
import jparsec.ephem.EphemerisElement;
import jparsec.ephem.Functions;
import jparsec.ephem.Target.TARGET;
import jparsec.ephem.planets.EphemElement;
import jparsec.graph.chartRendering.AWTGraphics;
import jparsec.graph.chartRendering.Graphics;
import jparsec.graph.chartRendering.PlanetRenderElement;
import jparsec.graph.chartRendering.SkyRenderElement;
import jparsec.graph.chartRendering.Projection.PROJECTION;
import jparsec.graph.chartRendering.RenderPlanet;
import jparsec.graph.chartRendering.SkyRenderElement.HORIZON_TEXTURE;
import jparsec.graph.chartRendering.SkyRenderElement.LEYEND_POSITION;
import jparsec.graph.chartRendering.SkyRenderElement.MILKY_WAY_TEXTURE;
import jparsec.graph.chartRendering.SkyRenderElement.REALISTIC_STARS;
import jparsec.graph.chartRendering.frame.SkyRendering;
import jparsec.io.ConsoleReport;
import jparsec.io.image.Picture;
import jparsec.math.Constant;
import jparsec.math.FastMath;
import jparsec.observer.City;
import jparsec.observer.LocationElement;
import jparsec.observer.ObserverElement;
import jparsec.time.AstroDate;
import jparsec.time.TimeElement;
 
public class StellariumAtmosphere {
 
	public static SkyRendering render;
 
	// Returns an image showing the sky for a given fixed set of properties: observer, appearance, and so on
	public static BufferedImage getStarsImage() throws Exception {        	
    	// Parameters for the rendering: geographical location and elevation, time zone,
    	// camera pointing direction and inclination, sensor size in pixels, field of view, ...
    	String locName = "Madrid";
    	double cameraAcimut = 90 * Constant.DEG_TO_RAD; // 0 = North
    	double cameraElevation = 50 * Constant.DEG_TO_RAD; // 0 = horizon, 90 = zenith
    	double cameraInclination = 0 * Constant.DEG_TO_RAD;
    	int cameraPixelX = 1200, cameraPixelY = 800;
    	double fov = 120 * Constant.DEG_TO_RAD;
    	TimeElement.SCALE ts = TimeElement.SCALE.LOCAL_TIME;
    	AstroDate astro = new AstroDate(2018, 2, 19, 10, 0, 0);
    	TimeElement date = new TimeElement(astro, ts);
 
    	EphemerisElement eph = new EphemerisElement(TARGET.SUN,
    		EphemerisElement.COORDINATES_TYPE.APPARENT, EphemerisElement.EQUINOX_OF_DATE,
    		EphemerisElement.TOPOCENTRIC, EphemerisElement.REDUCTION_METHOD.IAU_2006,
    		EphemerisElement.FRAME.DYNAMICAL_EQUINOX_J2000,
    		EphemerisElement.ALGORITHM.MOSHIER);
    	eph.optimizeForSpeed();
    	ObserverElement observer = ObserverElement.parseCity(City.findCity(locName));
 
    	SkyRenderElement sky = getSky(cameraAcimut, cameraElevation, cameraPixelX, cameraPixelY, fov, cameraInclination);
 
	sky.drawMilkyWayContoursWithTextures = MILKY_WAY_TEXTURE.OPTICAL;
	EphemElement ephemSun = Ephem.getEphemeris(date, observer, eph, true);
	if (ephemSun.elevation > -5 * Constant.DEG_TO_RAD) sky.drawMilkyWayContoursWithTextures = MILKY_WAY_TEXTURE.NO_TEXTURE;
	sky.drawMilkyWayContours = sky.drawMilkyWayContoursWithTextures == MILKY_WAY_TEXTURE.NO_TEXTURE ? false : true;
	int yMargin = 0;
 
	// These lines are used to change the vertical position of the horizon, to create panoramas
//		sky.height = sky.width/2+150;
//		yMargin = sky.height/2-100;
 
    	render = new SkyRendering(date, observer, eph, sky, locName, yMargin);  
    	// JPARSEC supports transparency using these two lines
    	AWTGraphics.setBufferedImageType(BufferedImage.TYPE_INT_ARGB);
    	BufferedImage out = render.createTransparentBufferedImage();
    	return out;
    }
 
    public static BufferedImage getSkyBrightness() throws Exception {
	EphemElement ephemSun = render.getRenderSkyObject().calcPlanet(TARGET.SUN, true, true);
	EphemElement ephemMoon = render.getRenderSkyObject().calcPlanet(TARGET.Moon, true, false);
 
	ConsoleReport.fullEphemReportToConsole(ephemSun);
 
	ToneReproductor eye = new ToneReproductor();
	StellariumAtmosphere atmosphere = new StellariumAtmosphere();
	float atmBr = 0.0001f, turbidity = 1f, atmI = 1000f;
	if (ephemSun.elevation < 0) {
		double elev = Math.min(18, Math.abs(ephemSun.elevation * Constant.RAD_TO_DEG));
		atmI = (float) (1000 + 5000 * elev / 15.0);
		//turbidity = (float) (2 - 1 * Math.abs(ephemSun.elevation * Constant.RAD_TO_DEG) / 15.0);
	}
        eye.setWorldAdaptationLuminance(3.75f + atmBr * 40000.f);
        atmosphere.atmIntensity = 255 * atmBr * atmI;			
 
        int year = render.getTimeObject().astroDate.getYear(), month = render.getTimeObject().astroDate.getMonth();
        double latitude = render.getObserverObject().getLatitudeDeg(), altitude = render.getObserverObject().getHeight(), 
        		temperature = render.getObserverObject().getTemperature(), relativeHumidity = render.getObserverObject().getHumidity();
        double moonPhase = Math.PI - ephemMoon.elongation;
 
        LocationElement sunLoc = new LocationElement(ephemSun.azimuth, ephemSun.elevation, 1);
        LocationElement moonLoc = new LocationElement(ephemMoon.azimuth, ephemMoon.elevation, 1);
        LocationElement zenLoc = new LocationElement(0, Constant.PI_OVER_TWO, 1);
	double moonZenD = LocationElement.getAngularDistance(moonLoc, zenLoc);
	double sunZenD = LocationElement.getAngularDistance(sunLoc, zenLoc);
 
	atmosphere.skyLight.setParams((float) sunZenD, turbidity);
	atmosphere.skyBright.setLoc(Math.toRadians(latitude), altitude, temperature, relativeHumidity);
	atmosphere.skyBright.setSunMoon(Math.cos(moonZenD), Math.cos(sunZenD));
 
	atmosphere.skyBright.setDate(year, month, moonPhase);
 
	int w = render.getRenderSkyObject().render.width;
	int h = render.getRenderSkyObject().render.height;
	BufferedImage out = new BufferedImage(w, h, BufferedImage.TYPE_INT_RGB);
	int black = (new java.awt.Color(0, 0, 0)).getRGB();
	for (int x = 0; x < w; ++x) {
	for (int y = 0; y < h; ++y) {
		LocationElement skyLoc = render.getRenderSkyObject().getSkyLocation(x, y);
		if (skyLoc.getLatitude() <= 0) {
			out.setRGB(x, y, black);
			continue;
		}
		double sunD = LocationElement.getApproximateAngularDistance(skyLoc, sunLoc); 
		double moonD = LocationElement.getApproximateAngularDistance(skyLoc, moonLoc); 
		double zenD = LocationElement.getApproximateAngularDistance(skyLoc, zenLoc); 
 
		double cosDistSun = FastMath.cos(sunD), cosDistZen = FastMath.cos(zenD);
		float color[] = atmosphere.skyLight.get_xyY_valuev(cosDistSun, 1.0 / cosDistZen);
		color[2] = (float) atmosphere.skyBright.getLuminance(FastMath.cos(moonD), 
			cosDistSun, cosDistZen);
 
		eye.xyYToRGB(color);
 
		int r = (int) (atmosphere.atmIntensity * color[0]);
		int g = (int) (atmosphere.atmIntensity * color[1]);
		int b = (int) (atmosphere.atmIntensity * color[2]);
		if (r > 255) r = 255;
		if (g > 255) g = 255;
		if (b > 255) b = 255;
		if (r < 0) r = 0;
		if (g < 0) g = 0;
		if (b < 0) b = 0;
		int rgb = (new java.awt.Color(r, g, b)).getRGB();
		out.setRGB(x, y, rgb);
	}
	}
	return out;
    }
 
    // Returns the sky render object containing all properties to define the rendering appearance
    public static SkyRenderElement getSky(double acimut, double elevation, int width, int height, double fov, double incl) {
    	PlanetRenderElement planet = new PlanetRenderElement(false, true, true, false);
	SkyRenderElement sky = new SkyRenderElement(COORDINATE_SYSTEM.HORIZONTAL,
		PROJECTION.STEREOGRAPHICAL, acimut, elevation, width, height, planet, 
		TelescopeElement.OBJECTIVE_50mm_f1_4);
	sky.telescope.ocular.focalLength = TelescopeElement.getOcularFocalLengthForCertainField(fov, sky.telescope);
	sky.drawObjectsLimitingMagnitude = 3f;        
	sky.drawStarsLimitingMagnitude = 5.0f;
	sky.drawPlanetsMoonSun = true;
	sky.drawSkyCorrectingLocalHorizon = true;        
	sky.drawSkyBelowHorizon = false;        
	sky.drawFastLabels = SkyRenderElement.SUPERIMPOSED_LABELS.AVOID_SUPERIMPOSING_ACCURATE;        
	sky.drawFastLabelsInWideFields = false;        
	sky.drawConstellationLimits = false;
	sky.drawLeyend = LEYEND_POSITION.TOP;
	sky.drawExternalGrid = false;
	sky.drawCoordinateGrid = false;
	sky.drawDeepSkyObjectsAllMessierAndCaldwell = false;
	sky.drawStarsRealistic = REALISTIC_STARS.SPIKED;
	sky.drawMeteorShowers = false;
	sky.drawNebulaeContours = false;
	sky.drawComets = sky.drawAsteroids = false;
	sky.drawConstellationNames = false;
	sky.poleAngle = (float) incl;
	sky.drawStarsLabelsLimitingMagnitude = 1.1f;
	sky.drawPlanetsMoonSun = true;
	sky.drawPlanetsLabels = true;
	sky.drawIcons = false;
	sky.drawSpaceProbes = false;
	sky.drawDeepSkyObjects = false;
	sky.drawConstellationNames = true;
	//sky.drawConstellationNamesType = CONSTELLATION_NAME.SPANISH;
	sky.drawConstellationNamesFont = Graphics.FONT.SANS_SERIF_ITALIC_22;
	sky.drawPlanetsNamesFont = Graphics.FONT.SANS_SERIF_BOLD_16;
	sky.drawStarsNamesFont = Graphics.FONT.DIALOG_PLAIN_12;
	sky.drawDeepSkyObjectsTextures = false;
	sky.drawCoordinateGridEclipticLabels = false;
	sky.planetRender.textures = false;
	sky.drawMilkyWayContoursWithTextures = MILKY_WAY_TEXTURE.OPTICAL;
	sky.drawMilkyWayContours = sky.drawMilkyWayContoursWithTextures == MILKY_WAY_TEXTURE.NO_TEXTURE ? false : true;
	sky.drawHorizonTexture = HORIZON_TEXTURE.VELETA_30m;
	sky.drawLeyend = LEYEND_POSITION.NO_LEYEND;
	sky.planetRender.highQuality = true;
	RenderPlanet.MAXIMUM_TEXTURE_QUALITY_FACTOR = 4f;
 
	sky.setColorMode(SkyRenderElement.COLOR_MODE.BLACK_BACKGROUND);
	sky.background = Functions.getColor(0, 0, 0, 0);
	return sky;
    }
 
    private Skylight skyLight = new Skylight();
 
    private SkyBright skyBright = new SkyBright();
 
    double worldAdaptationLuminance;
 
    double milkywayAdaptationLuminance;
 
    private float atmIntensity;
 
    public static void main(String args[]) {
    	try {
        	//Translate.setDefaultLanguage(LANGUAGE.SPANISH);
 
    		BufferedImage sky = getStarsImage();
    		Picture pic = new Picture(getSkyBrightness());
    		Graphics2D g = pic.getImage().createGraphics();
    		AWTGraphics.enableAntialiasing(g);
    		g.drawImage(sky, 0, 0, null);
    		g.dispose();
    		pic.show("");
    		pic.write("/home/alonso/test.png");
 
    	} catch (Exception exc) {
    		exc.printStackTrace();
    	}
    }
}
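// Note: the ToneReproductor class below appears to follow the Tumblin & Rushmeier
// tone reproduction operator as used by Stellarium 0.8; the alpha/beta adaptation
// formulas come from that model.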
 
class ToneReproductor {
 
   public ToneReproductor() {
        lda = 50.f;
        lwa = 40000.f;
        maxDL = 100.f;
        gamma = 2.3f;
 
        // Update alpha_da and beta_da values
        double log10Lwa = log10(lwa);
        alphaWa = 0.4f * log10Lwa + 1.519f;
        betaWa = -0.4f * log10Lwa * log10Lwa + 0.218f * log10Lwa + 6.1642f;
 
        setDisplayAdaptationLuminance(lda);
        setWorldAdaptationLuminance(lwa);
    }
 
    /**
     * Set the eye adaptation luminance for the display and precompute what can be
     * Usual luminance range is 1-100 cd/m^2 for a CRT screen
     *
     * @param _Lda
     */
    void setDisplayAdaptationLuminance(double _Lda) {
        lda = _Lda;
 
        // Update alpha_da and beta_da values
        double log10Lda = log10(lda);
        alphaDa = 0.4f * log10Lda + 1.519f;
        betaDa = -0.4f * log10Lda * log10Lda + 0.218f * log10Lda + 6.1642f;
 
        // Update terms
        alphaWaOverAlphaDa = alphaWa / alphaDa;
        term2 = pow(10.f, (betaWa - betaDa) / alphaDa) / (PI * 0.0001f);
    }
 
    /**
     * Set the eye adaptation luminance for the world and precompute what can be
     *
     * @param _Lwa
     */
    public void setWorldAdaptationLuminance(double _Lwa) {
        lwa = _Lwa;
 
        // Update alpha_da and beta_da values
        double log10Lwa = log10(lwa);
        alphaWa = 0.4f * log10Lwa + 1.519f;
        betaWa = -0.4f * log10Lwa * log10Lwa + 0.218f * log10Lwa + 6.1642f;
 
        // Update terms
        alphaWaOverAlphaDa = alphaWa / alphaDa;
        term2 = pow(10.f, (betaWa - betaDa) / alphaDa) / (PI * 0.0001f);
 
    }
 
    /**
     * Convert from xyY color system to RGB according to the adaptation
     * The Y component is in cd/m^2
     */
    public void xyYToRGB(float[] color) {
        // TODO: Fred, the parameter should be an SColor object
        // 1. Hue conversion
        float log10Y = (float) log10(color[2]);
        // if log10Y>0.6, photopic vision only (with the cones, colors are seen)
        // else scotopic vision if log10Y<-2 (with the rods, no colors, everything blue),
        // else mesopic vision (with rods and cones, transition state)
        if (log10Y < 0.6) {
            // Compute s, ratio between scotopic and photopic vision
            float s = 0.f;
            if (log10Y > -2.f) {
                float op = (log10Y + 2.f) / 2.6f;
                s = 3.f * op * op - 2 * op * op * op;
            }
 
            // Do the blue shift for scotopic vision simulation (night vision) [3]
            // The "night blue" is x,y(0.25, 0.25)
            color[0] = (1.f - s) * 0.25f + s * color[0];// Add scotopic + photopic components
            color[1] = (1.f - s) * 0.25f + s * color[1];// Add scotopic + photopic components
 
            // Take into account the scotopic luminance approximated by V [3] [4]
            double V = color[2] * (1.33f * (1.f + color[1] / color[0] + color[0] * (1.f - color[0] - color[1])) - 1.68f);
            color[2] = (float) (0.4468f * (1.f - s) * V + s * color[2]);
        }
 
        // 2. Adapt the luminance value and scale it to fit in the RGB range [2]
        color[2] = (float) pow(adaptLuminance(color[2]) / maxDL, 1.d / gamma);
 
        // Convert from xyY to XYZ
        double X = color[0] * color[2] / color[1];
        double Y = color[2];
        double Z = (1.f - color[0] - color[1]) * color[2] / color[1];
 
        // Use a XYZ to Adobe RGB (1998) matrix which uses a D65 reference white
        color[0] = (float) (2.04148f * X - 0.564977f * Y - 0.344713f * Z);
        color[1] = (float) (-0.969258f * X + 1.87599f * Y + 0.0415557f * Z);
        color[2] = (float) (0.0134455f * X - 0.118373f * Y + 1.01527f * Z);
    }
 
    /**
     * Set the maximum display luminance : default value = 100 cd/m^2
     * This value is used to scale the RGB range
     */
    void set_max_display_luminance(float _maxdL) {
        maxDL = _maxdL;
    }
 
    /**
     * Set the display gamma : default value = 2.3
     */
    void setDisplayGamma(float _gamma) {
        gamma = _gamma;
    }
 
    /**
     * Return adapted luminance from world to display
     */
    public double adaptLuminance(double worldLuminance) {
        return pow(worldLuminance * PI * 0.0001d, alphaWaOverAlphaDa) * term2;
    }
 
    private double lda;// Display luminance adaptation (in cd/m^2)
 
    double lwa;// World   luminance adaptation (in cd/m^2)
 
    double maxDL;// Display maximum luminance (in cd/m^2)
 
    double gamma;// Screen gamma value
 
    // Precomputed variables
    double alphaDa;
 
    double betaDa;
 
    double alphaWa;
 
    double betaWa;
 
    double alphaWaOverAlphaDa;
 
    double term2;
 
}
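// Note: the Skylight class below appears to implement the Preetham, Shirley & Smits
// (1999) analytic daylight model ("A Practical Analytic Model for Daylight"), as used
// by Stellarium 0.8; the A-E values are the luminance and chromaticity distribution
// coefficients of that model.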
 
class Skylight {
    public class SkylightStruct {
        double zenith_angle;// zenith_angle : angular distance to the zenith in radian
 
        double dist_sun;// dist_sun     : angular distance to the sun in radian
 
        double color[] = new double[3];// 3 component color, can be RGB or CIE color system
    }
 
    public static class SkylightStruct2 {
        public double pos[] = new double[3];// Vector to the position (vertical = pos[2])
 
        public float color[] = new float[3];// 3 component color, can be RGB or CIE color system
    }
 
    public Skylight() {
    }
 
    void setParams(float _sun_zenith_angle, float _turbidity) {
        // Set the two Main variables
        thetas = _sun_zenith_angle;
        T = _turbidity;
 
        // Precomputation of the distribution coefficients and zenith luminances/color
        compute_zenith_luminance();
        compute_zenith_color();
        compute_luminance_distribution_coefs();
        compute_color_distribution_coefs();
 
        // Precompute everything possible to increase the get_CIE_value() function speed
        double cos_thetas = cos(thetas);
        termX = (float) (zenithColorX / ((1.f + aX * exp(bX)) * (1.f + cX * exp(dX * thetas) + eX * cos_thetas * cos_thetas)));
        termY = (float) (zenith_color_y / ((1.f + Ay * exp(By)) * (1.f + Cy * exp(Dy * thetas) + Ey * cos_thetas * cos_thetas)));
        term_Y = (float) (zenith_luminance / ((1.f + AY * exp(BY)) * (1.f + CY * exp(DY * thetas) + EY * cos_thetas * cos_thetas)));
 
    }
 
    public void setParamsV(float[] _sun_pos, float _turbidity) {
        // Store sun position
        sunPos[0] = _sun_pos[0];
        sunPos[1] = _sun_pos[1];
        sunPos[2] = _sun_pos[2];
 
        // Set the two Main variables
        thetas = (float) (PI / 2 - asin(sunPos[2]));
        T = _turbidity;
 
        // Precomputation of the distribution coefficients and zenith luminances/color
        compute_zenith_luminance();
        compute_zenith_color();
        compute_luminance_distribution_coefs();
        compute_color_distribution_coefs();
 
        // Precompute everything possible to increase the get_CIE_value() function speed
        double cos_thetas = sunPos[2];
        termX = (float) (zenithColorX / ((1.f + aX * exp(bX)) * (1.f + cX * exp(dX * thetas) + eX * cos_thetas * cos_thetas)));
        termY = (float) (zenith_color_y / ((1.f + Ay * exp(By)) * (1.f + Cy * exp(Dy * thetas) + Ey * cos_thetas * cos_thetas)));
        term_Y = (float) (zenith_luminance / ((1.f + AY * exp(BY)) * (1.f + CY * exp(DY * thetas) + EY * cos_thetas * cos_thetas)));
    }
 
    /**
     * Compute CIE luminance for zenith in cd/m^2
     */
    void compute_zenith_luminance() {
        zenith_luminance = (float) (1000.f * ((4.0453f * T - 4.9710f) * tan((0.4444f - T / 120.f) * (PI - 2.f * thetas)) -
                0.2155f * T + 2.4192f));
        if (zenith_luminance <= 0.f) zenith_luminance = 0.00000000001f;
    }
 
    /**
     * Compute CIE x and y color components
     */
    void compute_zenith_color() {
        thetas2 = thetas * thetas;
        thetas3 = thetas2 * thetas;
        T2 = T * T;
 
        zenithColorX = (0.00166f * thetas3 - 0.00375f * thetas2 + 0.00209f * thetas) * T2 +
                (-0.02903f * thetas3 + 0.06377f * thetas2 - 0.03202f * thetas + 0.00394f) * T +
                (0.11693f * thetas3 - 0.21196f * thetas2 + 0.06052f * thetas + 0.25886f);
 
        zenith_color_y = (0.00275f * thetas3 - 0.00610f * thetas2 + 0.00317f * thetas) * T2 +
                (-0.04214f * thetas3 + 0.08970f * thetas2 - 0.04153f * thetas + 0.00516f) * T +
                (0.15346f * thetas3 - 0.26756f * thetas2 + 0.06670f * thetas + 0.26688f);
 
    }
 
    /**
     * Compute the luminance distribution coefficients
     */
    void compute_luminance_distribution_coefs() {
        AY = 0.1787f * T - 1.4630f;
        BY = -0.3554f * T + 0.4275f;
        CY = -0.0227f * T + 5.3251f;
        DY = 0.1206f * T - 2.5771f;
        EY = -0.0670f * T + 0.3703f;
    }
 
    /**
     * Compute the color distribution coefficients
     */
    void compute_color_distribution_coefs() {
        aX = -0.0193f * T - 0.2592f;
        bX = -0.0665f * T + 0.0008f;
        cX = -0.0004f * T + 0.2125f;
        dX = -0.0641f * T - 0.8989f;
        eX = -0.0033f * T + 0.0452f;
 
        Ay = -0.0167f * T - 0.2608f;
        By = -0.0950f * T + 0.0092f;
        Cy = -0.0079f * T + 0.2102f;
        Dy = -0.0441f * T - 1.6537f;
        Ey = -0.0109f * T + 0.0529f;
    }
 
    /**
     * Compute the sky color at the given position in the CIE color system and store it in p.color
     * p.color[0] is CIE x color component
     * p.color[1] is CIE y color component
     * p.color[2] is CIE Y color component (luminance)
     */
    void get_xyY_value(SkylightStruct p) {
        double cos_dist_sun = cos(p.dist_sun);
        double one_over_cos_zenith_angle = 1.f / cos(p.zenith_angle);
        p.color[0] = termX * (1.f + aX * exp(bX * one_over_cos_zenith_angle)) * (1.f + cX * exp(dX * p.dist_sun) +
                eX * cos_dist_sun * cos_dist_sun);
        p.color[1] = termY * (1.f + Ay * exp(By * one_over_cos_zenith_angle)) * (1.f + Cy * exp(Dy * p.dist_sun) +
                Ey * cos_dist_sun * cos_dist_sun);
        p.color[2] = term_Y * (1.f + AY * exp(BY * one_over_cos_zenith_angle)) * (1.f + CY * exp(DY * p.dist_sun) +
                EY * cos_dist_sun * cos_dist_sun);
    }
 
    /**
     * Compute the sky color at the given position in the CIE color system and store it in p.color
     * p.color[0] is CIE x color component
     * p.color[1] is CIE y color component
     * p.color[2] is CIE Y color component (luminance)
     */
    public void get_xyY_valuev(SkylightStruct2 p) {
        //	if (p.pos[2]<0.)
        //	{
        //		p.color[0] = 0.25;
        //		p.color[1] = 0.25;
        //		p.color[2] = 0;
        //		return;
        //	}
 
        double cosDistSun = sunPos[0] * (p.pos[0]) + sunPos[1] * (p.pos[1]) + sunPos[2] * (p.pos[2]) - 0.0000001f;
        double oneOverCosZenithAngle = 1.f / p.pos[2];
        float distSun = (float) acos(cosDistSun);
 
        p.color[0] = (float) (termX * (1.f + aX * exp(bX * oneOverCosZenithAngle)) * (1.f + cX * exp(dX * distSun) +
                eX * cosDistSun * cosDistSun));
        p.color[1] = (float) (termY * (1.f + Ay * exp(By * oneOverCosZenithAngle)) * (1.f + Cy * exp(Dy * distSun) +
                Ey * cosDistSun * cosDistSun));
        p.color[2] = (float) (term_Y * (1.f + AY * exp(BY * oneOverCosZenithAngle)) * (1.f + CY * exp(DY * distSun) +
                EY * cosDistSun * cosDistSun));
 
        if (p.color[2] < 0 || p.color[0] < 0 || p.color[1] < 0) {
            p.color[0] = 0.25f;
            p.color[1] = 0.25f;
            p.color[2] = 0;
        }
    }
 
    /**
     * Compute the sky color at the given position in the CIE color system and store it in p.color
     * p.color[0] is CIE x color component
     * p.color[1] is CIE y color component
     * p.color[2] is CIE Y color component (luminance)
     */
    public float[] get_xyY_valuev(double cosDistSun, double oneOverCosZenithAngle) {
        //	if (p.pos[2]<0.)
        //	{
        //		p.color[0] = 0.25;
        //		p.color[1] = 0.25;
        //		p.color[2] = 0;
        //		return;
        //	}
 
        float distSun = (float) FastMath.acos(cosDistSun);
 
        float color[] = new float[3];
        color[0] = (float) (termX * (1.f + aX * exp(bX * oneOverCosZenithAngle)) * (1.f + cX * exp(dX * distSun) +
                eX * cosDistSun * cosDistSun));
        color[1] = (float) (termY * (1.f + Ay * exp(By * oneOverCosZenithAngle)) * (1.f + Cy * exp(Dy * distSun) +
                Ey * cosDistSun * cosDistSun));
        color[2] = (float) (term_Y * (1.f + AY * exp(BY * oneOverCosZenithAngle)) * (1.f + CY * exp(DY * distSun) +
                EY * cosDistSun * cosDistSun));
 
        if (color[2] < 0 || color[0] < 0 || color[1] < 0) {
            color[0] = 0.25f;
            color[1] = 0.25f;
            color[2] = 0;
        }
        return color;
    }
 
    /**
     * Return the current zenith color in xyY color system
     */
    void get_zenith_color(double[] v) {
        v[0] = zenithColorX;
        v[1] = zenith_color_y;
        v[2] = zenith_luminance;
    }
 
    private float thetas;// angular distance between the zenith and the sun in radian
 
    float T;// Turbidity : i.e. sky "clarity"
 
    //  1 : pure air
    //  2 : exceptionally clear
    //  4 : clear
    //  8 : light haze
    // 25 : haze
    // 64 : thin fog
 
    // Computed variables depending on the 2 above
 
    float zenith_luminance;// Y color component of the CIE color at zenith (luminance)
 
    float zenithColorX;// x color component of the CIE color at zenith
 
    float zenith_color_y;// y color component of the CIE color at zenith
 
    double eye_lum_conversion;// luminance conversion for an eye adapted to screen luminance (around 40 cd/m^2)
 
    double AY, BY, CY, DY, EY;// Distribution coefficients for the luminance distribution function
 
    float aX, bX, cX, dX, eX;// Distribution coefficients for x distribution function
 
    double Ay, By, Cy, Dy, Ey;// Distribution coefficients for y distribution function
 
    float termX;// Precomputed term for x calculation
 
    float termY;// Precomputed term for y calculation
 
    float term_Y;// Precomputed term for luminance calculation
 
    float sunPos[] = new float[3];
 
    static float thetas2;
 
    static float thetas3;
 
    static float T2;
}
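// Note: the SkyBright class below appears to follow B. Schaefer's sky brightness
// model (Sky & Telescope, 1998), as used by Stellarium 0.8: dark night sky, twilight,
// moonlight, and daylight contributions are combined, each attenuated by atmospheric
// extinction.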
 
class SkyBright {
    public SkyBright() {
        setDate(2003, 8, 0);
        setLoc(Constant.PI_OVER_FOUR, 1000.d, 25.d, 40.d);
        setSunMoon(0.5, 0.5);
    }
 
    /**
     * @param year
     * @param month     1=Jan, 12=Dec
     * @param moonPhase in radians; 0=Full Moon, PI/2=First/Last Quarter, PI=New Moon
     */
    public void setDate(int year, int month, double moonPhase) {
        magMoon = -12.73d + 1.4896903d * abs(moonPhase) + 0.04310727d * pow(moonPhase, 4.d);
 
        RA = (month - 3.d) * 0.52359878d;
 
        // Term for dark sky brightness computation
        bNightTerm = 1.0e-13 + 0.3e-13 * cos(0.56636d * (year - 1992.d));
    }
 
 
    public void setLoc(double latitude, double altitude, double temperature, double relativeHumidity) {
        double signLatitude = (latitude >= 0.d) ? 1.d : -1.d;
 
        // extinction Coefficient for V band
        double KR = 0.1066d * exp(-altitude / 8200.d);
        double KA = 0.1d * exp(-altitude / 1500.d) * pow(1.d - 0.32d / log(relativeHumidity / 100.d), 1.33d) *
                (1.d + 0.33d * signLatitude * sin(RA));
        double KO = 0.031d * (3.d + 0.4d * (latitude * cos(RA) - cos(3.d * latitude))) / 3.d;
        double KW = 0.031d * 0.94d * (relativeHumidity / 100.d) * exp(temperature / 15.d) * exp(-altitude / 8200.d);
        K = KR + KA + KO + KW;
    }
 
    /**
     * Set the moon and sun zenith angular distance (cosine given) and precompute what can be
     *
     * @param cosDistMoonZenith
     * @param cosDistSunZenith
     */
    public void setSunMoon(double cosDistMoonZenith, double cosDistSunZenith) {
        // Air mass for Moon
        if (cosDistMoonZenith < 0) airMassMoon = 40.f;
        else airMassMoon = 1.f / (cosDistMoonZenith + 0.025f * exp(-11.f * cosDistMoonZenith));
 
        // Air mass for Sun
        if (cosDistSunZenith < 0) airMassSun = 40;
        else airMassSun = 1.f / (cosDistSunZenith + 0.025f * exp(-11.f * cosDistSunZenith));
 
        bMoonTerm1 = pow(10.f, -0.4 * (magMoon + 54.32f));
 
        C3 = pow(10.f, -0.4f * K * airMassMoon);// Term for moon brightness computation
 
        bTwilightTerm = -6.724f + 22.918312f * (Constant.PI_OVER_TWO - acos(cosDistSunZenith));
 
        C4 = pow(10.f, -0.4f * K * airMassSun);// Term for sky brightness computation
    }
 
    /**
     * Compute the luminance at the given position
     *
     * @param cosDistMoon cos(angular distance between moon and the position)
     * @param cosDistSun  cos(angular distance between sun  and the position)
     * @param cosDistZenith cos(angular distance between zenith and the position)
     *
     * @return
     */
    public double getLuminance(double cosDistMoon, double cosDistSun, double cosDistZenith) {
        // catch rounding errors here or end up with white flashes in some cases
        if (cosDistMoon < -1.d) cosDistMoon = -1.d;
        if (cosDistMoon > 1.d) cosDistMoon = 1.d;
        if (cosDistSun < -1.d) cosDistSun = -1.d;
        if (cosDistSun > 1.d) cosDistSun = 1.d;
        if (cosDistZenith < -1.d) cosDistZenith = -1.d;
        if (cosDistZenith > 1.d) cosDistZenith = 1.d;
 
        double distMoon = FastMath.acos(cosDistMoon);
        double distSun = FastMath.acos(cosDistSun);
 
        // Air mass
        double X = 1.d / (cosDistZenith + 0.025f * FastMath.exp(-11.d * cosDistZenith));
        double bKX = pow(10.d, -0.4f * K * X);
 
        // Dark night sky brightness
        bNight = 0.4f + 0.6f / FastMath.sqrt(0.04f + 0.96f * cosDistZenith * cosDistZenith);
        bNight *= bNightTerm * bKX;
 
        // Moonlight brightness
        double FM = 18886.28 / (distMoon * distMoon + 0.0007f) + pow(10.d, 6.15f - (distMoon + 0.001) * 1.43239f);
        FM += 229086.77f * (1.06f + cosDistMoon * cosDistMoon);
        bMoon = bMoonTerm1 * (1.d - bKX) * (FM * C3 + 440000.d * (1.d - C3));
 
        //Twilight brightness
        bTwilight = pow(10.d, bTwilightTerm + 0.063661977f * FastMath.acos(cosDistZenith) / K) *
                (1.7453293f / distSun) * (1.d - bKX);
 
        // Daylight brightness
        double FS = 18886.28f / (distSun * distSun + 0.0007f) + pow(10.d, 6.15f - (distSun + 0.001) * 1.43239f);
        FS += 229086.77f * (1.06f + cosDistSun * cosDistSun);
        bDaylight = 9.289663e-12 * (1.d - bKX) * (FS * C4 + 440000.d * (1.d - C4));
 
        // 27/08/2003 : Decided to increase moonlight for more halo effect...
        bMoon *= 2.;
 
        // Total sky brightness
        bTotal = bDaylight > bTwilight ? bNight + bTwilight + bMoon : bNight + bDaylight + bMoon;
 
        return (bTotal < 0.d) ? 0.d : bTotal * 900900.9f * PI * 1e-4 * 3239389 * 2;
        // In cd/m^2 : the last constant is an empirical term because the
        // lambert -> cd/m^2 formula seems to be wrong...
    }
 
    /*
250 REM  Visual limiting magnitude
260 BL=B(3)/1.11E-15 : REM in nanolamberts*/
 
    // Airmass for each component
    //cos_dist_zenith =cos(dist_zenith);
    //double gaz_mass = 1.f / ( cos_dist_zenith + 0.0286f *exp(-10.5f * cos_dist_zenith) );
    //double aerosol_mass = 1.f / ( cos_dist_zenith + 0.0123f *exp(-24.5f * cos_dist_zenith) );
    //double ozone_mass = 1.f /sqrt( 0.0062421903f - cos_dist_zenith * cos_dist_zenith / 1.0062814f );
    // Total extinction for V band
    //double DM = KR*gaz_mass + KA*aerosol_mass + KO*ozone_mass + KW*gaz_mass;
 
    /*
	// Visual limiting magnitude
	if (BL>1500.0)
	{
		C1 = 4.466825e-9;
		C2 = 1.258925e-6;
	}
	else
	{
		C1 = 1.584893e-10;
		C2 = 0.012589254;
	}
 
	double TH = C1*Math.pow(1.f+Math.sqrt(C2*BL),2.f); // in foot-candles
	double MN = -16.57-2.5*Math.log10(TH)-DM+5.0*Math.log10(SN); // Visual Limiting Magnitude
	*/
 
    /**
     * Air mass for the Moon
     */
    private double airMassMoon;
 
    /**
     * Air mass for the Sun
     */
    double airMassSun;
 
    /**
     * Total brightness
     */
    double bTotal;
 
    /**
     * Dark night brightness
     */
    double bNight;
 
    /**
     * Twilight brightness
     */
    double bTwilight;
 
    /**
     * Daylight sky brightness
     */
    double bDaylight;
 
    /**
     * Moon brightness
     */
    double bMoon;
 
    /**
     * Moon magnitude
     */
    double magMoon;
 
    /**
     * Something related with date
     */
    double RA;
 
    /**
     * Useful coef...
     */
    double K;
 
    /**
     * Term for moon brightness computation
     */
    double C3;
 
    /**
     * Term for sky brightness computation
     */
    double C4;
 
    /**
     * Snellen Ratio (20/20=1.0, good 20/10=2.0)
     */
    double SN = 1;
 
    // Optimisation variables
    double bNightTerm;
 
    double bMoonTerm1;
 
    double bTwilightTerm;
}

The method getSkyBrightness contains some variables that can be adapted in case you want to exaggerate the brightness (atmBr, atmI) or the red glow close to the horizon (turbidity). I have set them quickly, after a few tests, trying to use acceptable values depending on the Sun's elevation.

This kind of image is already being used to show the sky during the day on the web page of the OAN ephemerides server.

2018/02/19 16:10 · Tomás Alonso Albi

Using GPhoto for motion detection

This is my first blog entry in more than two years. I've been very busy with my Android planetarium project, now almost finished, but I probably also needed some rest. Despite that, during this time I have developed a number of nice little projects, and waited for the right moment to resume my blog activity and release some of them. As usual they are not oriented to specific goals, going from 3D renderings to solving complex and time-consuming tasks in a completely automatic manner, or even to cryptocurrencies.

In this post I present a little program aimed at detecting motion and automatically triggering shots with a camera. It is based on the gphoto library, and uses the gphoto binding implemented in the JPARSEC library. Some features of the binding have been improved or developed to support this use case. For instance, the live view mode didn't support executing commands other than those to capture previews, and it wasn't possible to run this mode without showing a panel object with those previews.

The program provides many (31) configuration options in a file named config.txt. Among others, it is possible to set custom parameters for the camera (ISO, shutter, aperture, and so on) at startup and right before the program ends, adjust the sensitivity of the motion detection algorithm, configure masks to search for motion only in certain areas, or upload the shots to a server.
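To give an idea of the format, here is an illustrative fragment of config.txt (the values shown are just the defaults from the code below; each line is a parameter name, a value, and an optional // comment):

UPDATE_REF 3600      // seconds between reference image updates, -1 to never update
RESAMPLE 320         // width to resample previews to, -1 for no resampling
THRESHOLD_BR 10      // brightness % change to trigger global movement
THRESHOLD_PX 10      // brightness % change to trigger pixel movement
MOVING_PIX_PER 5     // % of moving pixels to trigger local movement
FPS -2               // fps of live view, negative for <1 fps (-2 => 0.5 fps)
NSHOTS 3             // number of shots to take on each detection
ISO_IN null          // ISO to set at startup; null to read the current value from the camera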

My intention is to use this program for security, shooting with a high-resolution camera instead of a webcam when motion is detected. After thinking about this from time to time, I saw a post in the gphoto mailing list from Alan Corey, asking for a way to do something similar for wildlife photography. So I decided to start writing the program, and Alan collaborated by testing it on his cameras. In fact, the program has been tested successfully on a Canon 40D, a Nikon D5200, and a Canon PowerShot S70 and A520. It has also been tested on a variety of devices, from a powerful desktop to a Raspberry Pi 3. Here I will describe the basics of the program; some other details can be found in the gphoto mailing list, in previous messages about the same subject.

The motion detection algorithm is very simple. It searches for motion in two ways: by detecting a global change in the luminosity of the preview image, and by detecting changes in the brightness level of individual pixels. The preview image is internally converted to gray scale, since color data is not required to detect motion. The global change is computed from the difference between the histograms of the preview image and of a reference preview image taken when the program starts. This reference image should be free of movement, and is optionally updated at a given time interval to account for brightness changes due to the sky or the weather. The histogram difference is computed pixel by pixel, and this change in gray level is also used to count the individual pixels that show some movement, according to a separate change criterion specific to individual pixels. The second check is therefore based on the percentage of individual pixels showing movement, and motion is considered detected when either of the two methods triggers a detection.
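Stripped of the masking and drawing details, the two checks reduce to something like this (a simplified sketch of what the detectMove method below does; gray() and indexOfMaximum() are stand-ins for the corresponding Picture and DataSet calls):

int histo[] = new int[256];
int movingPixels = 0, total = w * h;
for (int y = 0; y < h; y++) {
	for (int x = 0; x < w; x++) {
		int diff = Math.abs(gray(preview, x, y) - gray(reference, x, y)); // 0-255
		histo[diff]++;                                 // histogram of gray level changes
		if (diff > thresholdPx * 2.55) movingPixels++; // per-pixel check
	}
}
int globalChange = (int) (indexOfMaximum(histo) / 2.55); // % change where the histogram peaks
double percMoving = movingPixels * 100.0 / total;        // % of pixels that moved
boolean move = globalChange > thresholdBr || percMoving > movingPixPerc;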

When this happens, the program pauses the live view in the JPARSEC binding, executes commands to take a number of shots, and resumes the live view mode afterwards.
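In terms of the binding, the cycle is essentially this (a condensed sketch of the main loop below, without the timeouts and error handling):

c.pauseLiveView();
for (int i = 0; i < nshot; i++) {
	c.executeExternalCommandInPausedLiveView("capture-image-and-download");
	while (c.getExternalCommandInPausedLiveView() != null) Thread.sleep(50); // wait until the shot is done
}
c.resumeLiveView();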

The program depends on the jsch library to upload images to a server, and on itextpdf for its Base64 class, used here to embed images in the pages served by the built-in web server. The idea is that it can be compiled with Java 6 or later. Here is the code of the main class.

GPhotoMoveDetection.java
import java.awt.Color;
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;
import java.io.File;
import java.io.IOException;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.ArrayList;
 
import com.itextpdf.text.pdf.codec.Base64;
import com.sun.net.httpserver.HttpExchange;
import com.sun.net.httpserver.HttpHandler;
import com.sun.net.httpserver.HttpServer;
 
import jparsec.graph.DataSet;
import jparsec.graph.chartRendering.AWTGraphics;
import jparsec.io.ConsoleReport;
import jparsec.io.FileIO;
import jparsec.io.ReadFile;
import jparsec.io.device.implementation.GPhotoCamera;
import jparsec.io.device.implementation.GPhotoCamera.CAMERA_ID;
import jparsec.io.device.implementation.GPhotoCamera.CAMERA_PARAMETER;
import jparsec.io.image.Picture;
import jparsec.time.AstroDate;
import jparsec.vo.FTP;
 
/**
 * An example of using GPhoto library (through the binding integrated in JPARSEC) to control 
 * a DSLR camera and trigger a photo when movement is detected.
 * 
 * @author T. Alonso Albi - OAN (Spain)
 * @version 1.0
 */
public class GPhotoMoveDetection {
 
	// Export only this class to gphoto.jar and execute with:
	// java -classpath gphoto.jar:jparsec.jar:jsch-0.1.41.jar:itextpdf-5.1.3.jar research.other.GPhotoMoveDetection 
 
	// TODO:
	// Option to pause motion detection ?
	// Server page reloading all the time, problematic with high fps. Reload by sections with different rates ?
	// Is it possible to trigger several shots quickly from shell ? Option to use the other method ?
	// Avoid overwriting old shots ?
	// Send e-mail with move alert ?
 
	public static GPhotoCamera c = null;
	public static String inISO, inAPERTURE, inCAPTURE_TARGET, inNIKON_QUALITY, inRESOLUTION, inSHUTTER_SPEED;
	public static String outISO, outAPERTURE, outCAPTURE_TARGET, outNIKON_QUALITY, outRESOLUTION, outSHUTTER_SPEED;
	public static FTP ftp = null;
	public static ArrayList<String> messages = new ArrayList<String>(), shotList = new ArrayList<String>();
	public static ArrayList<Object> thumbs = new ArrayList<Object>();
	public static int n = 0, nn = 0, moveEvents = 0;
	public static long lastMoveEvent;
	public static boolean forceExit = false, forcePause = false, createThumbs = true, keepInCamera = false;
	public static int webServerPort = 8080; // Server port (int)
	public static int liveMaxTime = 0; // sec, <= 0 => forever
	public static int updateRef = 3600; // sec, -1 => never update
	public static int resample = 320; // width, -1 => no resample
	public static int thresholdBr = 10; // brightness % change to trigger global movement 
	public static int thresholdPx = 10; // brightness % change to trigger pixel movement 
	public static int movingPixPerc = 5; // % of pixels with change greater than threshold to trigger local movement
	public static int fps = -2; // fps of live view, < 0 for <1 fps (-2 => 0.5 fps)
	public static int nshot = 3; // number of shots to take
	public static int maxShot = 1000; // maximum number of shots to take
	public static boolean rename = true; // True to rename shots to 'shot_xxx.jpg'
	public static String sound = null;
	public static String server = null, user = null, pass = null, remoteDir = null; // to upload images
	public static String extension = ".jpg"; // extension in lowercase for the new relevant images
	public static String mask[] = new String[] {
		"--------------------------------",
		"--------------------------------",
		"--------------------------------",
		"++++++++++++++++++++++++++++++++",
		"++++++++++++++++++++++++++++++++",
		"++++++++++++++++++++++++++++++++",
		"++++++++++++++++++++++++++++++++",
		"++++++++++++++++++++++++++++++++",
		"++++++++++++++++++++++++++++++++",
		"++++++++++++++++++++++++++++++++",
		"++++++++++++++++++++++++++++++++",
		"++++++++++++++++++++++++++++++++",
		"++++++++++++++++++++++++++++++++",
		"++++++++++++++++++++++++++++++++",
		"++++++++++++++++++++++++++++++++",
		"++++++++++++++++++++++++++++++++",
		"++++++++++++++++++++++++++++++++",
		"++++++++++++++++++++++++++++++++",
		"++++++++++++++++++++++++++++++++",
		"++++++++++++++++++++++++++++++++",
		"++++++++++++++++++++++++++++++++",
		"--------------------------------",
		"--------------------------------",
		"--------------------------------",
	};
	public static boolean debug = true;
 
	/**
	 * Test program.
	 * @param args Unused.
	 */
	public static void main(String args[])
	{
		try {			
			String dir = FileIO.getWorkingDirectory();
			loadConfig(false);
 
			String out[] = GPhotoCamera.autoDetect();
			if (debug) System.out.println("Using gphoto "+GPhotoCamera.gphotoVersion);
			if (debug) System.out.println("Detected cameras:");
			ConsoleReport.stringArrayReport(out);
			c = new GPhotoCamera(CAMERA_ID.EOS40D, null, dir, false, debug);
 
			// Set custom camera parameters or read them from camera
			inISO = setParameter(inISO, CAMERA_PARAMETER.ISO);
			inAPERTURE = setParameter(inAPERTURE, CAMERA_PARAMETER.APERTURE);
			inCAPTURE_TARGET = setParameter(inCAPTURE_TARGET, CAMERA_PARAMETER.CAPTURE_TARGET);
			inNIKON_QUALITY = setParameter(inNIKON_QUALITY, CAMERA_PARAMETER.NIKON_QUALITY);
			inRESOLUTION = setParameter(inRESOLUTION, CAMERA_PARAMETER.RESOLUTION);
			inSHUTTER_SPEED = setParameter(inSHUTTER_SPEED, CAMERA_PARAMETER.SHUTTER_SPEED);
 
			c.setTimeLimitForLiveView(liveMaxTime);
			c.setLiveFPS(fps);
			c.setCopyInCamera(keepInCamera);
 
			if (debug) {
				CAMERA_PARAMETER cv[] = CAMERA_PARAMETER.values();
				for (int i=0; i<cv.length; i++)
				{
					System.out.println("Possible values of "+cv[i]);
					String values[] = c.getConfig(cv[i]);
					ConsoleReport.stringArrayReport(values);
				}
			}
 
			try {
				if (debug) System.out.println("Creating web server on port "+webServerPort);
				createServer(webServerPort);
			} catch (Exception e) {
				e.printStackTrace();
			}
 
			n = 0;
			nn = 0;
			long lastShotTime = -1;
			while(!forceExit) {
				if (forcePause) {
					if (debug) System.out.println("Live view is currently paused");					
				} else {
					if (c.isLivePaused()) {
						if (debug) System.out.println("Resuming live view");
						c.resumeLiveView();
						loadConfig(true);
					} else {
						if (debug) System.out.println("Starting live view");
						if (!c.isLive()) c.startLiveView(null, fps);
					}
				}
				long wait = (1000 / fps);
				if (fps < 0) wait = (long) (1000 / (1.0 / Math.abs(fps)));
				while (!forcePause) {
					try {
						if (forceExit) break;
						if (!c.isLive()) {
							if (debug) System.out.println("Detected live view was stopped (limit time). Waiting 5s and restarting live view mode ...");
							long t0 = System.currentTimeMillis();
							long t1 = t0 + 5000; // Wait 5s to restart live view
							while (true) {
								long t2 = System.currentTimeMillis();
								if (t2 < t1) {
									Thread.sleep(500);
									continue;
								}
								break;
							}
							String lastShot = c.getLastShotPath();
							if (lastShot != null && FileIO.exists(lastShot)) FileIO.deleteFile(lastShot);
							if (debug) System.out.println("Starting live view");
							c.startLiveView(null, fps);
						}
 
						String lastShot = c.getLastShotPath();
						boolean move = false;
						if (lastShot != null && FileIO.exists(lastShot)) {
							long time = (new File(lastShot)).lastModified();
							if (time != lastShotTime) {
								Picture pic = new Picture(lastShot);
								move = detectMove(pic);
							}
							lastShotTime = time;
						}
 
						if (move) {
							if (debug) System.out.println("Pausing live view");
							c.pauseLiveView();
							long t0 = System.currentTimeMillis();
							long t1 = t0 + wait * 2;
							while (true) {
								long t2 = System.currentTimeMillis();
								if (t2 < t1) {
									Thread.sleep(100);
									continue;
								}
								break;
							}
							break;
						}
 
						Thread.sleep(wait / 50);
					} catch (Exception exc) {
						exc.printStackTrace();
						continue;
					}
				}
 
				if (forceExit) break;
				if (forcePause) {
					if (!c.isLivePaused()) c.pauseLiveView();
					long t0 = System.currentTimeMillis();
					long t1 = t0 + wait * 2;
					while (true) {
						long t2 = System.currentTimeMillis();
						if (t2 < t1) {
							try {
								Thread.sleep(1000);
							} catch (InterruptedException e) {
								e.printStackTrace();
							}
							continue;
						}
						break;
					}
					continue;
				}
 
				// Shot/s after move detection
				moveEvents ++;
				lastMoveEvent = System.currentTimeMillis();
				addMessage("Move detected!");
				if (n > maxShot) {
					addMessage("Aborting taking shots: too much files (> "+maxShot+")");
				} else {
					// Shot inside shell using the binding
					String com = "capture-image-and-download";
					boolean lock = false;
 
					String path = "";
					for (int i=0; i<nshot; i++) {
						c.executeExternalCommandInPausedLiveView(com);
						long t0 = System.currentTimeMillis();
						while (true) {
							Thread.sleep(50);
							if (c.getExternalCommandInPausedLiveView() == null) break;
							if (System.currentTimeMillis() - t0 > 30000) {
								lock = true;
								break;
							}
						}
						if (lock) break;
						path += c.getLastShotPath()+",";
					}
					if (!path.equals("")) path = path.substring(0, path.length()-1);
 
					if (lock) addMessage("Detected camera freeze when taking shots (too many of them?)");
					if (debug) System.out.println("Created files: "+path);
					n += nshot;
					if (!lock && !path.equals("")) {
						String p[] = DataSet.toStringArray(path, ",");
						for (int i=0; i<p.length; i++) {
							if (p[i].toLowerCase().endsWith(extension)) {
								if (debug) System.out.println("New shot found: "+p[i]);
								Picture pic = null;
								if (rename) {
									nn ++;
									pic = new Picture(p[i]);
									pic.write("shot"+nn+".jpg");
									FileIO.deleteFile(p[i]);
									p[i] = "shot"+nn+".jpg";
								}								
								if (createThumbs) {
									shotList.add(p[i]);
									if (pic == null) pic = new Picture(p[i]);
									if (resample > 0) pic.getScaledInstance(resample, 0, true);
									thumbs.add(pic.getImage());
								}
								if (ftp != null) {
									if (debug) System.out.println("Upploading "+p[i]+" to "+user+"@"+server);
									ftp.uppload(p[i], p[i]);
								}
							}
						}
					}
				}
 
				loadConfig(true);
				c.resumeLiveView();
 
				try {
					Thread.sleep(wait*2);
				} catch (Exception exc) {
					exc.printStackTrace();
				}
			}
 
			// Set at the end custom camera parameters or those read from camera at the beginning
			setParameter(inISO, outISO, CAMERA_PARAMETER.ISO);
			setParameter(inAPERTURE, outAPERTURE, CAMERA_PARAMETER.APERTURE);
			setParameter(inCAPTURE_TARGET, outCAPTURE_TARGET, CAMERA_PARAMETER.CAPTURE_TARGET);
			setParameter(inNIKON_QUALITY, outNIKON_QUALITY, CAMERA_PARAMETER.NIKON_QUALITY);
			setParameter(inRESOLUTION, outRESOLUTION, CAMERA_PARAMETER.RESOLUTION);
			setParameter(inSHUTTER_SPEED, outSHUTTER_SPEED, CAMERA_PARAMETER.SHUTTER_SPEED);
 
			if (debug) System.out.println("Exiting ...");
			if (c.isLivePaused()) c.resumeLiveView();
			c.stopLiveView();
			try {
				long wait = (1000 / fps);
				if (fps < 0) wait = (long) (1000 / (1.0 / Math.abs(fps)));
				long t0 = System.currentTimeMillis();
				long t1 = t0 + wait * 3;
				while (true) {
					long t2 = System.currentTimeMillis();
					if (t2 < t1) {
						try {
							Thread.sleep(100);
						} catch (InterruptedException e) {
							e.printStackTrace();
						}
						continue;
					}
					break;
				}
 
				System.exit(0);
			} catch (Exception exc) {
				exc.printStackTrace();
			}
 
		} catch (Exception exc)
		{
			exc.printStackTrace();
		}
	}
 
	public static String setParameter(String value, CAMERA_PARAMETER p) throws Exception {
		if (c == null || value == null) return value;
		if (value.equals("null")) {
			value = c.getParameterFromCamera(p);
		} else {
			c.setParameter(p, value);
		}
		return value;
	}
 
	public static void setParameter(String valueIn, String valueOut, CAMERA_PARAMETER p) throws Exception {
		if (c == null || valueIn == null || valueOut == null) return;
		if (valueOut.equals("null")) {
			c.setParameter(p, valueIn);
		} else {
			c.setParameter(p, valueOut);
		}
	}
 
	public static void loadConfig(boolean reload) throws Exception {
		String dir = FileIO.getWorkingDirectory(), config = dir + "config.txt";
		if (!FileIO.exists(config) && !reload) {
			System.out.println("Cannot find config file "+config);
			System.exit(0);
		}
 
		String d[] = DataSet.arrayListToStringArray(ReadFile.readAnyExternalFile(config));
		String param[] = new String[] {
			"UPDATE_REF", "RESAMPLE", "THRESHOLD_BR", "THRESHOLD_PX", "MOVING_PIX_PER", "FPS", 
			"NSHOTS", "MAX_SHOTS", "EXTENSION", "MASK", "RENAME", "SERVER", "USER", "PASSW", 
			"REMOTE_DIR", "LIVE_MAX_TIME", "WEB_SERVER_PORT", "CREATE_THUMBS", "KEEP_IN_CAMERA",
			"ISO_IN", "APERTURE_IN", "CAPTURE_TARGET_IN", "NIKON_QUALITY_IN", "RESOLUTION_IN", 
			"SHUTTER_SPEED_IN", "ISO_OUT", "APERTURE_OUT", "CAPTURE_TARGET_OUT", "NIKON_QUALITY_OUT", 
			"RESOLUTION_OUT", "SHUTTER_SPEED_OUT"
 
		};
		for (int i=0; i<d.length; i++) {
			for (int j=0; j<param.length; j++) {
				if (d[i].startsWith(param[j])) {
					if (debug) System.out.println("Reading "+d[i]);
					String val = d[i].substring(d[i].indexOf(" ")).trim();
					if (val.indexOf("//") > 0) val = val.substring(0, val.indexOf("//")).trim();
 
					if (j == 0) updateRef = Integer.parseInt(val);
					if (j == 1) resample = Integer.parseInt(val);
					if (j == 2) thresholdBr = Integer.parseInt(val);
					if (j == 3) thresholdPx = Integer.parseInt(val);
					if (j == 4) movingPixPerc = Integer.parseInt(val);
					if (j == 5) fps = Integer.parseInt(val);
					if (j == 6) nshot = Integer.parseInt(val);
					if (j == 7) maxShot = Integer.parseInt(val);
					if (j == 8) extension = val;
					if (j == 9) mask = DataSet.getSubArray(d, i+1, d.length-1);
					if (j == 10) rename = Boolean.parseBoolean(val);
					if (j == 11) server = val;
					if (j == 12) user = val;
					if (j == 13) pass = val;
					if (j == 14) remoteDir = val;
					if (j == 15) liveMaxTime = Integer.parseInt(val);
					if (j == 16) webServerPort = Integer.parseInt(val);
					if (j == 17) createThumbs = Boolean.parseBoolean(val);
					if (j == 18) keepInCamera = Boolean.parseBoolean(val);
 
					if (reload) continue;
					if (j == 19) inISO = val;
					if (j == 20) inAPERTURE = val;
					if (j == 21) inCAPTURE_TARGET = val;
					if (j == 22) inNIKON_QUALITY = val;
					if (j == 23) inRESOLUTION = val;
					if (j == 24) inSHUTTER_SPEED = val;
					if (j == 25) outISO = val;
					if (j == 26) outAPERTURE = val;
					if (j == 27) outCAPTURE_TARGET = val;
					if (j == 28) outNIKON_QUALITY = val;
					if (j == 29) outRESOLUTION = val;
					if (j == 30) outSHUTTER_SPEED = val;
				}
			}
		}
 
		if (ftp != null) ftp.disconnect();
		ftp = null;
		if (server != null && !server.equals("null") && user != null && !user.equals("null") 
				&& pass != null && !pass.equals("null")) 
			ftp = new FTP(server, user, pass);
		if (ftp != null && remoteDir != null && !remoteDir.equals("null")) ftp.changeDirectory(remoteDir);
 
		if (reload) addMessage("Configuration reloaded");
		if (c != null) {
			c.setLiveFPS(fps);
			c.setTimeLimitForLiveView(liveMaxTime);
			c.setCopyInCamera(keepInCamera);
		}
 
		String pp = dir + "sound.mp3";
		sound = null;
		if (FileIO.exists(pp)) {
			byte[] bytes = Files.readAllBytes(Paths.get(pp));
			sound = Base64.encodeBytes(bytes); 
			sound = "data:audio/mp3;base64,"+sound;
		}
	}
 
	public static Picture ref = null, lastPicColor = null, refColor = null;
	public static long lastRef = 0;
	public static boolean detectMove(Picture pic) throws Exception {
		if (resample > 0) pic.getScaledInstance(resample, 0, true);
 
		// Set reference at startup
		BufferedImage copy = Picture.copy(pic.getImage());
		if (refColor == null) {
			refColor = new Picture(copy);
			Graphics2D g2 = refColor.getImage().createGraphics();
			AWTGraphics.enableAntialiasing(g2);
			float fs = 20;
			g2.setColor(Color.RED);
			g2.setFont(g2.getFont().deriveFont(fs));
			g2.drawString(addMessage(null), fs, refColor.getHeight()-fs/2);
			g2.setColor(Color.WHITE);
			for (int y=0; y<pic.getHeight(); y++) {
				int cy = (int) (0.5 + (mask.length - 1.0) * (y / (pic.getHeight() - 1.0)));
				for (int x=0; x<pic.getWidth(); x++) {
					int cx = (int) (0.5 + (mask[cy].length() - 1.0) * (x / (pic.getWidth() - 1.0)));
					if (mask[cy].substring(cx, cx+1).equals("-"))
						continue;
					if (x % 3 != 0 || y % 3 != 0) continue;
					g2.drawLine(x-1, y, x+1, y);
					g2.drawLine(x, y+1, x, y-1);
				}
			}
			g2.dispose();
 
			pic.toGrayScale();
			ref = pic;
			lastRef = System.currentTimeMillis();
			addMessage("Reference created");
			return false;
		}
 
		// Detect movement
		lastPicColor = new Picture(pic.getImage());
		Graphics2D g2 = lastPicColor.getImage().createGraphics();
		AWTGraphics.enableAntialiasing(g2);
		float fs = 20;
		g2.setColor(Color.RED);
		g2.setFont(g2.getFont().deriveFont(fs));
		g2.drawString(addMessage(null), fs, refColor.getHeight()-fs/2);
		g2.setColor(Color.WHITE);
 
		pic.toGrayScale();
		boolean move = false;
		double histo[] = new double[256]; // 256 possible gray level differences (0-255)
		int movingPixels = 0, maskedPixels = 0;		
		for (int y=0; y<pic.getHeight(); y++) {
			int cy = (int) (0.5 + (mask.length - 1.0) * (y / (pic.getHeight() - 1.0)));
			for (int x=0; x<pic.getWidth(); x++) {
				int cx = (int) (0.5 + (mask[cy].length() - 1.0) * (x / (pic.getWidth() - 1.0)));
				if (mask[cy].substring(cx, cx+1).equals("-")) {
					maskedPixels ++;
					continue;
				}
 
				Color c1 = pic.getColorAt(x, y);
				Color c2 = ref.getColorAt(x, y);
				int gray1 = c1.getRGB() & 255;
				int gray2 = c2.getRGB() & 255;
				int g = Math.abs(gray1 - gray2);
				histo[g] ++;
				if (g > thresholdPx*2.55) {
					movingPixels ++;
					if (x % 3 != 0 || y % 3 != 0) continue;
					g2.drawLine(x-1, y, x+1, y);
					g2.drawLine(x, y+1, x, y-1);
				}
			}			
		}
		g2.dispose();
		int maxIndex = (int) (DataSet.getIndexOfMaximum(histo) / 2.55);
		double percMov = movingPixels * 100.0 / (double) (pic.getWidth() * pic.getHeight() - maskedPixels);
		if (maxIndex > thresholdBr || percMov > movingPixPerc) move = true;
		if (debug) System.out.println("Global move: "+maxIndex+"/"+thresholdBr+". Local move: "+(float)percMov+"/"+movingPixPerc);
 
		// Update reference if required
		double elapsed = (System.currentTimeMillis() - lastRef) * 0.001;
		if (!move && updateRef > 0 && elapsed > updateRef) {
			ref = pic;
			refColor = new Picture(copy);
			lastRef = System.currentTimeMillis();			
 
			g2 = refColor.getImage().createGraphics();
			AWTGraphics.enableAntialiasing(g2);
			fs = 20;
			g2.setColor(Color.RED);
			g2.setFont(g2.getFont().deriveFont(fs));
			g2.drawString(addMessage(null), fs, refColor.getHeight()-fs/2);
			g2.setColor(Color.WHITE);
			for (int y=0; y<pic.getHeight(); y++) {
				int cy = (int) (0.5 + (mask.length - 1.0) * (y / (pic.getHeight() - 1.0)));
				for (int x=0; x<pic.getWidth(); x++) {
					int cx = (int) (0.5 + (mask[cy].length() - 1.0) * (x / (pic.getWidth() - 1.0)));
					if (mask[cy].substring(cx, cx+1).equals("-"))
						continue;
					if (x % 3 != 0 || y % 3 != 0) continue;
					g2.drawLine(x-1, y, x+1, y);
					g2.drawLine(x, y+1, x, y-1);
				}
			}
			g2.dispose();
 
			addMessage("Reference updated");
		}
		return move;
	}
 
	public static String addMessage(String msg) {
		AstroDate astro = new AstroDate();
		String m = astro.toString();
		if (msg != null) {
			m += ": "+msg;
			messages.add(0, m);
 
			if (debug) System.out.println(msg);
		}
		return m;
	}
 
	public static String commands[] = new String[] {
		"Pause,left,Pausing web server",
		"Resume,center,Resuming web server",
		"Exit,right,Exiting"
	};
	public static String imgFormat = "jpg"; // jpg to reduce file size
	public static void createServer(int port) throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(port), 0);
        server.createContext("/", new MyHandler(-1));
 
        for (int i=0; i<commands.length; i++) {
        	String name = "com"+i;
 
            server.createContext("/"+name, new MyHandler(i));
        }
 
        server.setExecutor(null); // creates a default executor
        server.start();
	}
 
   static class MyHandler implements HttpHandler {
    	int index = -1;
 
    	public MyHandler() {
		}
 
    	public MyHandler(int i) {
    		this.index = i;
		}
 
        @Override
        public void handle(HttpExchange t) throws IOException {
        	StringBuffer response = new StringBuffer();
        	String sep = "<BR><BR>", step = FileIO.getLineSeparator();
 
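			// Auto-refresh interval for the page: fps > 0 is frames per second,
			// while fps < 0 is interpreted as one frame every |fps| seconds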
			long wait = (1000 / fps);
			if (fps < 0) wait = (long) (1000 / (1.0 / Math.abs(fps)));
			int sec = (int) (wait / 1000 + 0.5);
    		response.append("<html><head><title>Web server</title><meta http-equiv=\"refresh\" content=\""+sec+", url=/\"></head><body bgcolor=\"#000000\">");
    		response.append("<center><H1 style=\"color: white\">GPhoto web server</H1></center>"+ step);
    		try {
    			response.append("<p style=\"color: white; font-size:30px; float: left\">Reference</p><p style=\"color: white; font-size:30px; float: right\">Live view (last image)</p>" + sep + step);
    			response.append("<div style=\"display: inline-block\">" + step);
        		if (refColor != null)
        			response.append("<img src=\""+refColor.imageToString(imgFormat, true)+"\" style=\"width: 49%; float: left\" />" + step);
        		if (lastPicColor != null)
        			response.append("<img src=\""+lastPicColor.imageToString(imgFormat, true)+"\" style=\"width: 49%; float: right\" />"+step);
        		response.append("</div>" + step);
    		} catch (Exception exc) {
    			exc.printStackTrace();
    		}
    		response.append(sep + "<HR>" + step);
    		if (c != null) {
    			response.append("<p style=\"color: white; font-size:20px\">Camera: "+c.getModel()+", port "+c.getPort()+"</p>"+step);
    			response.append("<p style=\"color: white; font-size:20px\">Working directory: "+c.getWorkingDirectory()+"</p>"+step);
        		String status = "running";
        		if (c.isLivePaused()) status = "paused";
    			response.append("<p style=\"color: white; font-size:20px\">Status: "+status+"</p>"+step);
    			response.append("<p style=\"color: white; font-size:20px\">Move events: "+moveEvents+"</p>"+step);
    			response.append("<p style=\"color: white; font-size:20px\">Shots taken: "+n+"</p>"+step);
    			String config = "liveMaxTime="+liveMaxTime+", ";
    			config += "updateRef="+updateRef+", ";
    			config += "resample="+resample+", ";
    			config += "thresholdBr="+thresholdBr+", ";
    			config += "thresholdPx="+thresholdPx+", ";
    			config += "movingPixPerc="+movingPixPerc+", ";
    			config += "fps="+fps+", ";
    			config += "nshot="+nshot+", ";
    			config += "maxShot="+maxShot+", ";
    			config += "rename="+rename+", ";
    			config += "createThumbs="+createThumbs+", ";
    			config += "keepInCamera="+keepInCamera+", ";
    			config += "server="+user+"@"+server+", ";
    			response.append("<p style=\"color: white; font-size:20px\">Configuration: "+config+"</p>"+step);
    		}
 
    		response.append("<HR>" + sep + step);
    		response.append("<div style=\"display: block; text-align: center\">" + step);
    		for (int i=0; i<commands.length; i++) {
    			String title = FileIO.getField(1, commands[i], ",", false);
    			String align = FileIO.getField(2, commands[i], ",", false);
    			response.append("<input type=\"button\" style=\"color: white; float: "+align+"; font-size:40px\" onclick=\"location.href='/com"+i+"';\" value=\""+title+"\" />" + step);
    		}
    		response.append("</div>" + step);
 
    		if (index >= 0) {
    			String msg = FileIO.getRestAfterField(2, commands[index], ",", false).trim();
        		response.append("<center><H2 style=\"color: white\">" + msg + sep + step);
        		response.append("</H2></center>");
 
        		if (index == 0) {
        			forcePause = true;
        			addMessage("Paused");
        		}
        		if (index == 1) {
        			forcePause = false;
        			addMessage("Resumed");
        		}
        		if (index == 2) {
        			forceExit = true;
        			addMessage("Exited");
        		}
    		}
 
    		if (shotList != null && shotList.size() > 0) {
    			response.append("<HR>" + step);
 
    			try {
        			boolean left = true;
	        		for (int i=shotList.size()-1; i>=0; i--) {
	        			if (left) response.append("<div style=\"display: inline-block\">" + step);		        			
	        			Picture pic = new Picture((BufferedImage) thumbs.get(i));
	        			response.append("<img src=\""+pic.imageToString(imgFormat, true)+"\" style=\"width: 49%; float: "+(left ? "left" : "right")+"\" />" + step);
	        			if (!left) {
	        				response.append("</div>" + step);
	            			response.append("<p style=\"color: white; font-size:24px; float: left\">"+shotList.get(i+1)+"</p><p style=\"color: white; font-size:24px; float: right\">"+shotList.get(i)+"</p>" + step);
	        			} else {
	        				if (i == 0)
		            			response.append("<p style=\"color: white; font-size:24px; float: left\">"+shotList.get(i)+"</p>" + step);
	        			}
	        			left = !left;
	        		}
    			} catch (Exception exc) {
    				exc.printStackTrace();
    			}
 
    			response.append(sep + sep + sep + sep + sep);
    		}  
 
    		if (messages != null && messages.size() > 0) {
        		response.append("<HR>" + step);
 
        		String m[] = DataSet.arrayListToStringArray(messages);
        		for (int i=0; i<m.length; i++) {
        			response.append("<p style=\"color: white\"; font-size:16px>"+m[i]+"</p>" + step);
        		}
    		}
 
    		if (sound != null && lastMoveEvent > 0) {
    			double elapsed = (System.currentTimeMillis() - lastMoveEvent) * 0.001;
    			if (elapsed < sec) {
		    		response.append("<audio controls=\"controls\" autobuffer=\"autobuffer\" autoplay=\"autoplay\">"+step);
		    		response.append("<source src=\""+sound+"\" />"+step);
		    		response.append("</audio>"+step);
    			}
    		}
 
    		response.append("</body></html>");
 
            byte bytes[] = response.toString().getBytes();
            t.sendResponseHeaders(200, bytes.length); // byte length must match what is written
            OutputStream os = t.getResponseBody();
            os.write(bytes);
            os.close();
        }
    }
}

The program includes a little web server to see the reference image with the movement detection mask, the latest live view image showing the pixels with movement, some buttons to pause or stop the program, and some log messages. The server also shows the shots taken when movement was detected. The next image shows how the web server looks.


 

To access the web server on the same computer on which gphoto is running and the camera is connected, you just have to open a browser and load http://localhost:port, where port is the port number configured in the program (8080 by default). Of course, knowing the computer's IP you can load it from another device on the same network, or even from somewhere outside that network with a proper configuration of the router.

The full working program can be downloaded from this url. It should work with no modification on Linux and Mac. On Windows the launching script would need some modification (mainly changing the classpath separator from : to ; and writing the commands to a .bat file), but in any case you first need to compile and install the gphoto library, so anyone capable of doing that will have no problem executing the program.

2018/01/19 13:32 · Tomás Alonso Albi

ClearSky for Android released

After four years of developing a planetarium for Android from time to time (since the first experiments with this platform back in 2012), I have published the ClearSky planetarium in two versions: a free version quite generous in features, and a paid version oriented toward becoming useful to amateur astronomers with telescopes. This second page contains a detailed list of features of the commercial version, although the help document included in ClearSky describes everything in even more detail.

The free version contains almost everything a casual observer would need, including the possibility of showing comets and asteroids, which is usually offered only by commercial Android programs. But the main difference from other tools is the great accuracy of JPARSEC, superior to most free and commercial programs available even on the PC platform. In addition, Spanish and English are supported, with absolutely no ads.


 

Design and user interface

One of my main concerns when developing ClearSky has been to be as objective as possible, so I haven't taken any other Android planetarium as a reference when thinking about the design or the features. After installing all the other free planetariums, I think most of them are not really helpful for an amateur astronomer, and even some commercial ones (I admit I haven't paid for any of them, I prefer to enjoy developing my own) seem to offer just 'more options' or objects in the paid version, instead of focusing on 'more activities' like observing with telescopes or planning observations with a list of astronomical events or objects. Most of the development comes from JPARSEC, even in terms of the design of the different color schemes, and I have developed this library for years, so this Android planetarium has the benefit of years of experience with many little cosmetic and usability improvements. However, Android development is hard: you have to hit your head against hundreds of walls and overcome every problem until you end up with a finished product.

My idea of an adequate planetarium for the general public is a program that is extremely easy and comfortable to use. I have seen too many planetariums with great graphics, but where it is hard to drag and zoom the sky, and the objects are moving all the time. This seems to be a must for other developers, apparently just to mimic the natural sky, but for astronomy you need to take your time to do things, and you don't want everything moving. Despite this, some programs are reasonably well solved in this sense, for instance the Cosmos Celestron Navigator, but even there it is hard to select a body and zoom in and out. In ClearSky the sky is updated at regular intervals, and zoom operations are fast and reliable: a single tap identifies an object and shows the distance with respect to the previously identified body, and a double tap centers that body without the mess of a menu for just that (in ClearSky the menu is triggered with a long press, offering, among other things, the details of the object and the option to track it).

An example of a less well solved program is SkEye, where the zoom and rotation operations mess everything up, making it impossible to zoom in on a given body. In addition, due to rotation and the use of equatorial positions, you don't know where up is, where the horizon is, or the azimuth/elevation direction, and there are too many numbers that are not important for a user. SkEye is surprisingly well rated despite all this, and despite the fact that I've seen three degrees of error in the position of Jupiter, so I won't call it accurate… ClearSky, although not as beautiful as other planetariums (though I think it is well enough worked in that sense), has a much more useful user interface, making the program a really useful and comfortable tool, not a simple toy to play with for a few minutes. For instance, you can switch directly between drag and zoom at any time (keeping at least one finger on the screen), something not possible in other planetariums.


 

Another issue for me is to make the user interface reasonably beautiful and comfortable on the eye. This means an adequate number and distribution of options, and adequate icons. In the main window the icons have colors, and there are six for completely different tasks. SkEye shows nine (!), which is too many for a phone, and some of them are related to the same thing, like changing simulation conditions or the appearance, things that are not oriented to 'activities'. In ClearSky you have a search button, which is a must; an option to change the color scheme, which is useful simply because many users expect it and will play with it; and a help/trivia/more-options icon (it is configurable) on the right corner of the first row (easier to reach with the finger). The trivia is a game to play, fun if you play it for a few minutes from time to time. I think it is important to have at least one configurable option for the user. The second row shows the view mode (live, text, augmented reality, and chart modes), the list of astronomical events, and the configuration option, again easier to reach in that position. Since the free version has only two modes (live and chart), the first option simply swaps between them; in the paid version a menu is inevitable. The astronomical events are another must in my opinion, since a user will want to know what's going on in the sky, and whether the full moon or the astronomical twilight will limit the time interval of really dark skies. All configuration options are provided through the configuration option, although the most important of them are also available in the configurable icon, when it is configured to show a menu called 'more options'. There are a lot of configurable options, almost 100, but they are properly categorized and offered in two levels, showing fewer options by default. The interface section has an option to change from a simple to a complete user interface, showing all possible options.


 

Text mode

The text mode is something probably missing from every other Android planetarium. It is only available in the commercial version, since this feature is specific to amateur astronomers, not the general public. It lists all deep sky objects and the main double/variable stars, sorted by name, object type, magnitude, position in the sky, or transit time. Supernovae and novae will also appear if you enable them, but only when they are visible in the sky (field of view around 50 degrees or lower). There is an option to set an alarm for the transit time (as with the events, obviously), in case you want to observe some bodies at their greatest elevations. You can set an object as a reference, so that the azimuth/elevation position (or right ascension/declination, in case you prefer equatorial coordinates in the configuration) will appear as offsets relative to that reference body. The position column will change to distance, so you can sort objects by their distances to the reference one: very useful for observing interesting objects close to the one you are currently observing. If you enter the text mode from the live view mode, the text mode will also be 'live', sorting the objects by default by their distances to the direction the device points to. If you hold the device adequately on top of the telescope, you can use this feature to convert any telescope into a push-to one. In case the device doesn't point perfectly to the object, there is an align option in this mode, replacing the option to set an object as reference. And if you just want the few objects you like, you can add objects to a list of favourites and show only them.
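As an illustration of the distance-based sorting, here is a minimal sketch using JPARSEC. This is my own example, not the actual ClearSky code; it assumes LocationElement.getAngularDistance for the angular separation, and the positions are made up:

SortByDistance.java
import java.util.Arrays;
import java.util.Comparator;
 
import jparsec.math.Constant;
import jparsec.observer.LocationElement;
 
public class SortByDistance {
	public static void main(String[] args) {
		// Hypothetical reference direction (longitude, latitude in radians, unit radius)
		final LocationElement reference = new LocationElement(0.5, 0.8, 1.0);
 
		// Hypothetical object positions in the same coordinate system
		LocationElement objects[] = new LocationElement[] {
			new LocationElement(0.6, 0.7, 1.0),
			new LocationElement(2.1, 0.2, 1.0),
			new LocationElement(0.4, 0.9, 1.0)
		};
 
		// Sort by angular separation from the reference direction
		Arrays.sort(objects, new Comparator<LocationElement>() {
			@Override
			public int compare(LocationElement a, LocationElement b) {
				double da = LocationElement.getAngularDistance(reference, a);
				double db = LocationElement.getAngularDistance(reference, b);
				return Double.compare(da, db);
			}
		});
 
		for (LocationElement loc: objects) {
			System.out.println((float) (LocationElement.getAngularDistance(reference, loc) * Constant.RAD_TO_DEG) + " deg");
		}
	}
}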


 

Astronomical events

The list of astronomical events is useful for preparing an observation night. In addition to the main general events offered in the free version, the paid version includes events related to natural satellites (mainly Jupiter's, but also Saturn's and Uranus's). The events for natural satellites even include mutual events (for instance, a moon of Jupiter occulting or eclipsing another moon), something very interesting for amateur astronomers and probably only offered in ClearSky because of its great accuracy. The list of events for artificial satellites shows the next passes of the main satellites (ISS, HST, and Tiangong 1), as well as their transits across the solar and lunar disks. Iridium flares are also computed and simulated.


 

Astronomical equipment

The commercial version of ClearSky can show the field of view of any telescope with a horizontal or equatorial mount. When a telescope is selected, a long press on a star adds another option to the context menu, allowing you to test the polar alignment of that equipment on that star and at that moment. The program asks for four values (see the documentation) and computes, from the deviations of the star (measured in pixels in the camera), where the mount is really pointing, so that an incremental correction to improve the alignment is possible. The feature is not yet in its final state, but it works.


 

Other features

The links provided at the top list most of the features of ClearSky, but I would like to emphasize some of them. First, the catalogs of stars, and especially of deep sky objects, are very robust. I worked on the deep sky catalog myself for years, fixing and improving things over time and checking coordinates against Simbad (it is based on the revised NGC).

You can reach magnitude 16 in the commercial version, the only requirement being a network connection (previously downloaded fields can be used offline too). Other programs require 1 GB of star data on your device, which is absurd for a phone. It is spectacular to resolve globular clusters like M13 into stars, even without deep sky textures, and to see how star positions and textures match completely.

700 textures of deep sky objects are overlaid on the sky with great accuracy, corrected for precession and nutation. The commercial version has high resolution textures, with the possibility of downloading more.

Visual quality and accuracy have been worked to a high level, so accurate and realistic planetary rendering is possible in ClearSky. It is not as fast as I would like, but good enough (illumination and everything else is computed pixel by pixel, without 3D OpenGL). You can even identify planetary features by tapping with a finger, something in fact used to allow simulating the sky from other bodies.


 

There are also useful color schemes, as these two screenshots show:


 

And if you are curious about the trivia, here are two screenshots of it. There are more than 200 different questions, implemented in Spanish and English, although many of them are related to identifying objects or constellations.


 

Accuracy

I will not add more words about accuracy; I have already talked about it before, comparing JPARSEC with other PC programs. In ClearSky, this accuracy means the program is suitable for studies in the field of archaeoastronomy, since proper motions of stars (the natural change in the shape of the constellations) are considered, and planets will appear correctly back to the year 3000 B.C. Obviously more accuracy means more complicated and slower algorithms, and this is less compatible with showing the sky in real time (with objects moving), especially on a platform with strong memory and speed limitations. I don't like it, but I admit it can be required to track artificial satellites, which move fast. I currently have a list of a few bugs to correct, and some features that will be implemented over time to end up with a product to my (and hopefully also others') liking. The hard work is done, but this is just the first release.
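As a side note, the proper motion correction itself is conceptually simple. Here is a minimal sketch of the usual linear approximation (my own illustration, not the ClearSky code; the catalog values for Barnard's star are approximate):

ProperMotion.java
public class ProperMotion {
	/**
	 * Applies proper motion linearly to a J2000 position (radians).
	 * muRaCosDec is assumed to already include the cos(dec) factor,
	 * as in modern catalogs; both mu values are in radians per year.
	 */
	public static double[] apply(double raJ2000, double decJ2000,
			double muRaCosDec, double muDec, double yearsFromJ2000) {
		double dec = decJ2000 + muDec * yearsFromJ2000;
		double ra = raJ2000 + muRaCosDec * yearsFromJ2000 / Math.cos(decJ2000);
		return new double[] { ra, dec };
	}
 
	public static void main(String[] args) {
		// Barnard's star, the star with the largest known proper motion (~10.3"/yr)
		double masToRad = Math.PI / (180.0 * 3600.0 * 1000.0);
		double pos[] = apply(Math.toRadians(269.45), Math.toRadians(4.69),
				-798.6 * masToRad, 10328.0 * masToRad, -5000); // 5000 years into the past
		System.out.println(Math.toDegrees(pos[0]) + ", " + Math.toDegrees(pos[1]));
	}
}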

2015/12/24 12:48 · Tomás Alonso Albi

Astronomical Trivia

I'm currently very busy with different things in parallel, and it is uncertain when I will finish each of them. In recent days I have reviewed all of them as a frame for a talk this evening at the Agrupación Astronómica de Madrid (AAM), the main amateur astronomy group in Madrid. The talk is about a tool to simulate astronomical events as they would be visible through a given kind of telescope, in order to prepare observations (in the science section there is a link to this presentation, in Spanish). This tool is currently limited to fellows at AAM, but could be opened to the general public in the near future. Among other things, I will talk about the recent aesthetic upgrade to the ephemerides server, the project to create an automated observatory (more on this not before the end of the summer…), the Android planetarium (the current beta available at the main JPARSEC page is quite good already, although with some bugs), and various minor things done over the years. Most, if not all of them, have been documented more or less in this blog or on other pages (the projects page, for instance), but for one of them there is nothing about it here: the astronomical trivia.

I created a little astronomical trivia for a talk last November in the frame of the Semana de la Ciencia in Madrid, a two-week event with lots of scientific talks about many different areas, organized by many institutes in Madrid. As an experiment, I decided to write this little trivia game to let the public enjoy the 20 minutes before the talk started, while people were still arriving. I wanted to see whether this could be attractive as a way to entertain the people waiting, and also to prepare their minds with questions related to the talk.


 

The result was indeed quite positive. I saw all kinds of faces, from people who knew everything to others very clearly showing just the opposite, and others simply laughing. Of course, there is always someone who doesn't notice at all that a trivia game is on the screen…

The program is available for download at this url. It should work on all operating systems (with Java installed), and in fact includes a .sh file to run it on Linux and Mac, and a .bat file for Windows. A text file is used to configure everything (including the background images and the questions themselves) using a strict format, but one that is quite clear, I think. The trivia contains 40 questions, translated into both Spanish and English, with different difficulty levels. Since it is fully customizable, you can replace everything to create a trivia game about any other scientific field. Windows requires a special file to account for the different line-ending characters on that system, so in case you use it, be careful which file you modify and on which system.

I plan to add something similar to the Android planetarium…

2015/04/07 15:51 · Tomás Alonso Albi

Recentering the disk of the Sun or Moon in eclipses

We recently (March 20) had a partial solar eclipse. At OAN we have very limited resources for visual observing and for trying to show a live view of an eclipse to the public. This time we bought a Celestron 8-24 mm zoom eyepiece, which includes a T2 mount for DSLR cameras, and T2 adapters for Canon and Nikon cameras. We already have a Coronado Hα telescope, and among the staff at OAN we own various Canon and Nikon DSLRs. The idea was to show a live view of the eclipse through our eclipse web page, but weather conditions in Spain were horrible that day and it was impossible. Anyway, it was really nice to see the great interest of the people in the eclipse, and our web page became possibly the best of all Spanish web pages dedicated to the eclipse, reaching 75 000 visits on the day of the eclipse and more than 200 000 in the few weeks around the date.

One of the main problems when showing a live view of the Sun was recentering and cropping the disk of the Sun, so that the disk appears static to the people watching it on the web page. For this task I prepared a little program to recenter and crop it, with many options like adding a few labels and uploading the result directly to the server.

The program uses a simple algorithm to calculate the smallest enclosing circle for a set of points. The points are computed from the photo, taking the edges of the Sun (or Moon) disk and separating them from the background. Background separation is done in a very simplified way, good enough for eclipses but not for other uses. In fact, since the weather was poor, the few tests that were possible showed that this separation did not work well: there was scattered light produced by the clouds, and it was difficult for the algorithm to separate disk and background. In sunny conditions (with a background dark enough) the process worked as expected in the tests performed in the days before the eclipse. The algorithm itself seems quite robust. The only weak point is that all points computed at the edge of the disk must really be there; even with one wrong point among 1000 correct ones, the cropped image can differ from the expected one.
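The classical technique for this problem is Welzl's algorithm, which finds the smallest enclosing circle in expected linear time after a random shuffle. Here is a minimal, self-contained sketch of that technique (my own illustration, not necessarily the exact code used by the program):

EnclosingCircle.java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
 
public class EnclosingCircle {
	static class Circle {
		double x, y, r;
		Circle(double x, double y, double r) { this.x = x; this.y = y; this.r = r; }
		boolean contains(double[] p) {
			double dx = p[0] - x, dy = p[1] - y;
			return dx * dx + dy * dy <= r * r + 1e-9; // small tolerance for rounding
		}
	}
 
	public static Circle smallestCircle(List<double[]> points) {
		List<double[]> pts = new ArrayList<double[]>(points);
		Collections.shuffle(pts); // randomization gives expected linear time
		Circle c = null;
		for (int i = 0; i < pts.size(); i++) {
			if (c == null || !c.contains(pts.get(i))) {
				// pts.get(i) must lie on the boundary of the new circle
				c = new Circle(pts.get(i)[0], pts.get(i)[1], 0);
				for (int j = 0; j < i; j++) {
					if (!c.contains(pts.get(j))) {
						c = circleFrom2(pts.get(i), pts.get(j));
						for (int k = 0; k < j; k++) {
							if (!c.contains(pts.get(k)))
								c = circleFrom3(pts.get(i), pts.get(j), pts.get(k));
						}
					}
				}
			}
		}
		return c;
	}
 
	// Smallest circle with two boundary points: they define a diameter
	private static Circle circleFrom2(double[] a, double[] b) {
		double cx = (a[0] + b[0]) / 2, cy = (a[1] + b[1]) / 2;
		return new Circle(cx, cy, Math.hypot(a[0] - cx, a[1] - cy));
	}
 
	// Circumcircle of three points
	private static Circle circleFrom3(double[] a, double[] b, double[] c) {
		double d = 2 * (a[0] * (b[1] - c[1]) + b[0] * (c[1] - a[1]) + c[0] * (a[1] - b[1]));
		double ux = ((a[0]*a[0] + a[1]*a[1]) * (b[1] - c[1])
				+ (b[0]*b[0] + b[1]*b[1]) * (c[1] - a[1])
				+ (c[0]*c[0] + c[1]*c[1]) * (a[1] - b[1])) / d;
		double uy = ((a[0]*a[0] + a[1]*a[1]) * (c[0] - b[0])
				+ (b[0]*b[0] + b[1]*b[1]) * (a[0] - c[0])
				+ (c[0]*c[0] + c[1]*c[1]) * (b[0] - a[0])) / d;
		return new Circle(ux, uy, Math.hypot(a[0] - ux, a[1] - uy));
	}
}

The disk center is then the center of the returned circle, and the crop is just a fixed border around its radius.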


 

The previous image shows the before and after result of applying this process to an Hα image taken with a Nikon camera. We did tests using both real Hα images and simulated ones (using an Hα image as reference, and also optical ones). Regarding the simulated images, we tested this algorithm against a number of situations, from solar to lunar eclipses, different zoom scales, having the Sun disk partially out of the frame, and with/without background stars. All tests showed good results from the recentering algorithm, which allowed us to obtain a video with the sequence of these eclipses. The images for these tests, with the Sun and Moon disks displaced from the center, were generated using the JPARSEC library.

The following videos show the results of applying this program to the simulated sequences of the solar and lunar eclipses of 2015. Click on the links below the video frame to show each of them. The lunar eclipse sequence was created using the debug mode of the program, which shows the set of points around the edge of the disk used later to get the enclosing circle. As you can see, the algorithm is quite fast, processing a 10 Mpx image in 0.5 seconds or less on modern PCs (with the debug mode activated the process is slower).

Solar eclipse Hα / Lunar eclipse

The program is available for download at this url. It should work on all operating systems. A recenter.txt file is used to configure the different options of the program, like the input/output images, labels, the border to leave around the cropped image, file uploading, and many others. Two nice features are obtaining the date of the photo from the EXIF data of the JPG file using this library, and uploading it directly to a server using the JSch library. The hot pixel parameter controls the maximum size a star or image defect can have, so that artifacts smaller than this are not considered to lie at the edge of the disk.
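For reference, uploading a file with JSch boils down to opening an SFTP channel. Here is a minimal sketch (my own illustration, not necessarily how the program does it; host, user, password, and paths are placeholders):

UploadExample.java
import com.jcraft.jsch.ChannelSftp;
import com.jcraft.jsch.JSch;
import com.jcraft.jsch.Session;
 
public class UploadExample {
	public static void main(String[] args) throws Exception {
		JSch jsch = new JSch();
		Session session = jsch.getSession("user", "myserver.example.com", 22);
		session.setPassword("password");
		// For a quick test only; in production the host key should be verified properly
		session.setConfig("StrictHostKeyChecking", "no");
		session.connect();
 
		ChannelSftp sftp = (ChannelSftp) session.openChannel("sftp");
		sftp.connect();
		sftp.put("/tmp/eclipse.jpg", "/var/www/html/eclipse.jpg"); // local -> remote
		sftp.exit();
		session.disconnect();
	}
}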

Fortunately there is a transit of Mercury next year, so this work can still be of use. I also expect it to be useful for others with the same problem.

2015/03/21 12:48 · Tomás Alonso Albi

Gallery

Selected astronomical images taken with my equipment. As you will see, I'm not a professional photographer, but I do my best. Some of them are old images taken with a film camera; recent ones use a digital SLR. In addition to the images I also have some videos:

Solar eclipse: Solar annular eclipse of October 3, 2005. It was my first day at OAN, so I couldn't use my instruments; I simply held the camera in my hands.

Venus transit: Venus transit across the Sun on June 8, 2004. The black drop effect is clearly visible.

Solar eclipse: Solar eclipse of August 11, 1999, as shot through my 20 cm S/C telescope.
