# Nikola Vitas

## Wednesday, 17 October 2018

### Three-dimensional simulations of solar magneto-convection including effects of partial ionization

Astronomy & Astrophysics Volume 618 (October 2018) is just out with a figure from our paper on the cover!

The paper (Khomenko, Vitas, Collados & de Vicente 2018: A&A, ADS) describes the effects of partial ionization on the structure, dynamics and energy balance of the low chromosphere.

Labels:
astronomy,
chromosphere,
magnetic fields,
MANCHA,
MHD,
papers,
partial ionization,
solar atmosphere

Location:
Santa Cruz de Tenerife, Spain

## Tuesday, 4 July 2017

### How to display magnetic field lines in IDL?

Displaying field lines is a common problem in the visualization of magnetic fields. If we assume that the potential $A$ of the magnetic field is known, the field lines are, by definition, iso-$A$ curves. The quickest way to display them is by using the CONTOUR procedure. If the potential is given as a 2D variable `potential`, then:

`IDL> CONTOUR, potential, levels = levels, /xs, /ys`

gives a plot of the field lines.

However, the information here is not complete without showing the actual direction of the field along the lines. It is easy to do in IDL:

`IDL> CONTOUR, potential, levels = levels, /xs, /ys`

`IDL> CONTOUR, potential, levels = levels, /xs, /ys, path_xy = c`

`IDL> FOR i = 1, N_ELEMENTS(c)/2-1, 50 DO $`

`ARROW, c[0, i-1], c[1, i-1], c[0, i], c[1, i], /norm, /solid, hsize = 5`
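Why iso-$A$ contours trace the field lines can be checked numerically: in 2D the field derived from the potential, $B = (\partial A/\partial y, -\partial A/\partial x)$, is everywhere perpendicular to $\nabla A$, so the field is tangent to the contours of $A$. A minimal sketch in Python/NumPy (the potential below is an arbitrary illustrative function, not one from this post):

```python
import numpy as np

# Arbitrary smooth 2D potential A(x, y) on a regular grid.
x = np.linspace(-1.0, 1.0, 200)
y = np.linspace(-1.0, 1.0, 200)
X, Y = np.meshgrid(x, y, indexing='ij')
A = np.exp(-(X**2 + Y**2)) * np.sin(3.0 * X)

# Field derived from the potential: B = (dA/dy, -dA/dx).
dA_dx, dA_dy = np.gradient(A, x, y)
Bx, By = dA_dy, -dA_dx

# B . grad(A) vanishes identically, i.e. B is tangent to iso-A contours,
# which is why CONTOUR of the potential draws the field lines.
dot = Bx * dA_dx + By * dA_dy
print(np.max(np.abs(dot)))   # 0.0
```

The same construction justifies overplotting arrows along the contour path: consecutive points of an iso-$A$ curve are displacements in the direction of $B$ (up to sign).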

Labels:
IDL,
magnetic fields,
solar atmosphere,
teaching,
tips

Location:
Santa Cruz de Tenerife, Spain

## Tuesday, 11 April 2017

### Deep-learning about horizontal velocities at the solar surface

Velocity fields are of great importance for understanding the dynamics and structure of the solar atmosphere. The line-of-sight velocities are encoded in the wavelength shifts of the spectral lines, thanks to the Doppler effect, and are relatively easy to measure. The orthogonal ("horizontal") components of the velocity vector, on the other hand, are impossible to measure directly.
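The conversion from a measured wavelength shift to a line-of-sight velocity is just the non-relativistic Doppler formula, $v = c\,\Delta\lambda/\lambda_0$. A tiny illustrative example (the line and the shift below are made-up numbers, chosen near the 630.2 nm iron lines mentioned later):

```python
# Doppler formula v = c * dlambda / lambda0 (non-relativistic).
c = 299792.458       # speed of light [km/s]
lambda0 = 630.25     # rest wavelength of a photospheric Fe I line [nm]
dlambda = 0.01       # hypothetical measured wavelength shift [nm]

v_los = c * dlambda / lambda0
print(round(v_los, 3))   # line-of-sight velocity [km/s], ~4.757
```

A shift of a hundredth of a nanometre thus corresponds to a velocity of almost 5 km/s, comparable to granular flow speeds; this is what makes Doppler measurements so sensitive.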

The most popular method for estimating the horizontal velocities is the so-called local correlation tracking (LCT, November & Simon, 1988). It is based on comparing successive images of the solar surface in continuum light and transforming their differences into information about the horizontal velocity field. However, the LCT algorithm suffers from several limitations.
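The core idea behind correlation tracking can be sketched in a few lines: the displacement between two successive images appears as the peak of their cross-correlation. This is an illustrative toy (random image, known shift), not the LCT code of November & Simon:

```python
import numpy as np

n = 64
rng = np.random.default_rng(0)
img1 = rng.random((n, n))
# Second "exposure": the same scene displaced by a known amount.
img2 = np.roll(img1, shift=(3, -2), axis=(0, 1))

# Cross-correlation via FFT; the position of its peak is the shift.
corr = np.fft.ifft2(np.fft.fft2(img1).conj() * np.fft.fft2(img2)).real
dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
# Wrap periodic indices to signed shifts.
dy = dy - n if dy > n // 2 else dy
dx = dx - n if dx > n // 2 else dx
print(dy, dx)   # recovers the imposed shift (3, -2)
```

Real LCT does this locally, in small apodized windows, which is also the source of its limitations: the window size sets an arbitrary spatial scale, and intensity features do not always move with the plasma.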

In a paper by Andres Asensio Ramos and Iker S. Requerey (with a small contribution from my side), accepted by A&A and published on arXiv a few weeks ago (2017arXiv170305128A), this problem is tackled with a deep-learning approach. A deep fully convolutional neural network is trained on synthetic observations from 3D MHD simulations of the solar photosphere and then applied to real observations from the IMaX instrument on board the SUNRISE balloon (Martinez Pillet et al., 2011; Solanki et al., 2010). The method is validated using quiet-Sun simulation snapshots produced with the MANCHA code that I have been developing over the last couple of years.


Labels:
data mining,
IAC,
MANCHA,
solar atmosphere,
sun,
tools

Location:
Santa Cruz de Tenerife, Spain

## Monday, 20 March 2017

### K-means clustering

The problem of clustering is a rather general one: given $m$ observations or measurements in an $n$-dimensional space, how does one identify $k$ clusters (classes, groups, types) of measurements and their centroids (representatives)?

The k-means method is extremely simple, rather robust and widely used in its numerous variants. It is essentially very similar (but not identical) to Lloyd's algorithm (a.k.a. Voronoi relaxation, used in computer science).

Let's use the following indices: $i$ counts measurements, $i \in [0, m-1]$; $j$ counts dimensions, $j \in [0, n-1]$; $l$ counts clusters, $l \in [0, k-1]$.

Each measurement in $n$-dimensional space is represented by a vector $x_i = \{x_{i, 0}, \dots x_{i, n-1}\}$, where index $i$ is counting different measurements ($i = 0, \dots, m-1$). The algorithm can be summarized as:

1. Choose randomly $k$ measurements as the initial cluster centers: $c_0, \dots, c_{k-1}$. Each of the cluster centers is also an $n$-dimensional vector.

2. Compute Euclidean distance $D_{i, l}$ between every measurement $x_i$ and every cluster center $c_l$:

$$D_{i, l} = \sqrt{\sum_{j=0}^{n-1} (x_{i, j} - c_{l, j})^2}.$$

3. Assign every measurement $x_i$ to the cluster represented by the closest cluster center $c_l$.

4. Now compute new cluster centers by simply averaging all the measurements in each cluster.

5. Go back to 2. and keep iterating until none of the measurements changes its cluster in two successive iterations.

This procedure is initiated randomly, so the result will be slightly different in every run. The result of the clustering (and the actual number of necessary iterations) depends significantly on the initial choice of cluster centers. The easiest way to improve the algorithm is to improve that initial choice, i.e. to alter only step 1 and then iterate as before. There are two simple alternatives for the initialization.
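Steps 1-5 above translate almost line by line into code. A minimal illustrative implementation in Python/NumPy (names and the toy data are mine, not from the post):

```python
import numpy as np

def kmeans(x, k, seed=None, max_iter=100):
    """Plain k-means following steps 1-5 above. x has shape (m, n)."""
    rng = np.random.default_rng(seed)
    # Step 1: pick k distinct measurements as initial cluster centers.
    centers = x[rng.choice(x.shape[0], size=k, replace=False)]
    labels = None
    for _ in range(max_iter):
        # Step 2: Euclidean distances D[i, l] between points and centers.
        d = np.linalg.norm(x[:, None, :] - centers[None, :, :], axis=2)
        # Step 3: assign each measurement to the closest center.
        new_labels = d.argmin(axis=1)
        # Step 5: stop when no measurement changed its cluster.
        if labels is not None and np.array_equal(new_labels, labels):
            break
        labels = new_labels
        # Step 4: new centers = mean of the members of each cluster.
        centers = np.array([x[labels == l].mean(axis=0) for l in range(k)])
    return labels, centers

# Two well-separated pairs of points; k-means should split them.
pts = np.array([[0.0, 0.0], [0.0, 1.0], [10.0, 10.0], [10.0, 11.0]])
labels, centers = kmeans(pts, k=2, seed=0)
print(labels)
```

Note that the actual label numbers (which cluster is 0 and which is 1) depend on the random initialization; only the grouping is meaningful, which is exactly the run-to-run variability discussed above.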


Labels:
astronomy,
data mining,
IDL,
statistics

Location:
Santa Cruz de Tenerife, Spain

## Wednesday, 1 February 2017

### First observation of linear polarization in the forbidden [OI] 630.03 nm line

In a new paper (de Wijn, Socas-Navarro & Vitas, 2017, ApJ, 836, 29) we present the first results of our observations of a sunspot and an active region using the SP/SOT instrument on board the Hinode satellite. The novelty in our observation is a trick that we used to double the standard wavelength range observed by the instrument. Thanks to that, we were able to see the Sun not only in the two iron lines at 630.2 nm, but also in four other lines. One of those is particularly interesting: the forbidden line of neutral oxygen ([OI] 630.03 nm). It is one of only a few oxygen lines in the solar spectrum and probably the best diagnostic of the solar oxygen abundance.

For the first time ever we observed linear polarization in this line! For an M1 (magnetic dipole) transition, theory predicts (Landi Degl'Innocenti & Landolfi, 2004, Section 6.8) that the line produces a linear-polarization signal with the opposite sign to that of lines produced by E2 transitions. It is also the first time that linear polarization in magnetic-dipole and nearby E2 lines is measured simultaneously, so that the flip in sign is obvious (see the left-most spectral line in the red circle in the Figure: in linear polarization it has a "W" shape, while all other lines in the wavelength range have "M" shapes). This result may shed new light on the ongoing debate about the solar oxygen crisis.

More details of this unique observation will appear soon in a follow-up publication.

Labels:
abundances,
astronomy,
atomic physics,
data,
papers,
solar atmosphere,
spectropolarimetry,
sun

Location:
Santa Cruz de Tenerife, Spain

## Wednesday, 30 November 2016

### 1D Solar Atmosphere Models in IDL: Penumbra by Ding & Fang (1989)

A plane-parallel atmosphere in hydrostatic equilibrium published by Ding & Fang (1989, "A semi-empirical model of sunspot penumbra", 1989A&A...225..204D). Statistical equilibrium is solved for a hydrogen model atom with 12 levels plus continuum. The model is produced by fitting observations of the penumbra in two lines of H and five lines of Ca. The observations were carried out with the McMath telescope at Kitt Peak National Observatory. The observed sunspot was small, round and close to the disk center. The field strength was around 1.25 kG in the umbra and 560 G in the penumbra.

It is interesting to note their Fig. 2 (see it below). In the deep photosphere, the temperature in this model is similar to the temperature in the model of Yun et al. (1981). However, between log optical depths -3 and -4 it becomes close to the VALC model (Vernazza et al., 1981).


Labels:
astronomy,
data,
IDL,
one-dimensional models,
solar atmosphere,
sun

Location:
Santa Cruz de Tenerife, Spain

## Friday, 21 October 2016

### Installing IDL v7.1 on Kubuntu 16.04 (on MacBook pro)

The IDL installation went smoothly as usual, but when I tried to run IDL, there was an error:

`error while loading shared libraries: libXp.so.6: cannot open shared object file: No such file or directory`

In the IDL help pages (IDL fails to install on Linux: What to do; scroll down to the Ubuntu section) there is a comment on that which turned out only half useful. Two of the libraries installed with no problem:

`sudo apt-get install libxmu-dev`

`sudo apt-get install libxmu6`

but the other three were not found in the repositories. Two of them are available from http://packages.ubuntu.com/xenial/, but the last one is a bit mysterious: Google finds it only on three pages related to IDL, one of them being the previously mentioned page from the IDL docs. So, instead of installing them manually, I built libXp from Git. As I had just installed Kubuntu, some important packages were still missing, so some of the following steps may be redundant for you.

`sudo apt-get install autoconf autogen intltool`

`sudo apt-get install git`

`sudo apt-get install xutils-dev libtool libx11-dev x11proto-xext-dev x11proto-print-dev`

`git clone https://cgit.freedesktop.org/xorg/lib/libXp/`

Then `cd` to the libXp directory and execute

`sudo ./autogen.sh`

In my case it complained about line 18214, related to XPRINT. I commented out that line from the script and executed it again. After that everything is straightforward:

`sudo ./configure`

`sudo make install`

And finally I added the path to the library to .bashrc (all in one line):

`echo 'export LD_LIBRARY_PATH="/usr/local/lib/:$LD_LIBRARY_PATH"' >> ~/.bashrc`

After that IDL worked normally.
