Physics Data & Astronomical Technology (PDAT) laboratory
Physics is driven by observational data and new technologies. The Physics Data & Astronomical Technology (PDAT) laboratory was established by Dr. Javad T. Firouzjaee in 2021 to exploit these two factors: observational data and new technologies. The laboratory follows two main paths. The first is to develop tools for the analysis, mining, and application of data in different fields of study, especially cosmology and astronomy. The second is to build astronomical instruments for collecting observational data. With this in mind, the lab has begun projects in a variety of areas.
A brief description of these projects is given below:
20.1 MHz radio telescope

In the fall of 2020, the radio astronomy research team of the Physics Department at K. N. Toosi University of Technology started to build a 20.1 MHz classical half-wave radio telescope under the supervision of Dr. T. Firouzjaee. Ehsan Mahdavi and Hojat Fathi, members of the group, began assembling this radio telescope with the goal of studying Jupiter and solar activity.

In general, a radio telescope has three parts: 1) antenna, 2) receiver, and 3) data processor. The half-wave dipole antenna consists of two conductive (copper) wires separated from each other by a small insulator; the total length of the set equals half the wavelength of the received electromagnetic wave. When solar radio emission reaches the antenna's copper wires, a tiny voltage on the order of microvolts is produced at the antenna terminals. The antenna is installed away from power lines to limit interference. The radio-frequency signal produced at the antenna terminals is sent to the receiver over a coaxial cable (a low-loss cable); the length of the coaxial cable between the antenna and the receiver must be a multiple of half the wavelength. The receiver amplifies and filters the signal to increase the signal-to-noise ratio and to block noise from radio stations in the city; it is tuned to 20.1 MHz and amplifies the signal by resonating at that frequency.

The amplified signal must then be analyzed. To do so, we use a computer to store long-term data and derive information by plotting the intensity over time. By analyzing this chart we can identify solar activity and the times of solar bursts. We can compare the times of detected solar bursts with the average number of sunspots in the same period and show the correlation between them; we can also study the correlation between solar activity and the weather.
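The geometry above follows directly from the half-wave condition. As a quick sketch (illustrative numbers only, not lab measurements), the dipole length and a valid coaxial run for 20.1 MHz can be computed as:

```python
# Sizing the half-wave dipole described above (illustrative calculation).
C = 299_792_458.0          # speed of light, m/s
freq_hz = 20.1e6           # observing frequency of the telescope

wavelength = C / freq_hz            # full wavelength, roughly 14.9 m
dipole_length = wavelength / 2.0    # half-wave dipole, roughly 7.5 m
coax_length = 3 * wavelength / 2.0  # one allowed coax run: a multiple of half the wavelength

print(f"wavelength    = {wavelength:.2f} m")
print(f"dipole length = {dipole_length:.2f} m")
print(f"example coax  = {coax_length:.2f} m")
```

In practice the physical wire is trimmed slightly shorter than the free-space half wavelength because of end effects, but the free-space value sets the scale.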
Construction of a 1420 MHz radio telescope antenna

The construction of this telescope began with the establishment of the laboratory at K. N. Toosi University. In less than one year, we were able to build a 20.1 MHz radio telescope. In the next step, we aim to build a 2.3-meter radio telescope operating at 1420 MHz, which is suitable for detecting the 21 cm hydrogen emission line. First, we prepared a suitable antenna design; then, in cooperation with a team under the supervision of Dr. Ali Akbarian of the Faculty of Electrical Engineering, we built the antenna. The project is now in its final stages.
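The stated dish size and frequency fix the telescope's angular resolution. A rough sketch (using the common 70 λ/D rule of thumb for the half-power beamwidth of a filled circular dish; the numbers are illustrative, not measured lab values):

```python
# Rough beam size of a 2.3 m dish at the 21 cm hydrogen line (illustrative).
C = 299_792_458.0
freq_hz = 1420.0e6      # HI line frequency, ~1420 MHz
dish_d = 2.3            # dish diameter in meters, from the text

wavelength = C / freq_hz                # ~0.211 m, i.e. the "21 cm" line
hpbw_deg = 70.0 * wavelength / dish_d   # rule-of-thumb half-power beamwidth

print(f"wavelength ~ {wavelength * 100:.1f} cm, beamwidth ~ {hpbw_deg:.1f} deg")
```

A beam of several degrees is typical for small-dish HI telescopes and is wide enough to map the Milky Way's hydrogen by drift scanning.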
Machine learning in astrophysics (Dr. Zhoolideh Haghighi's team)

My research interests include extragalactic astronomy, galaxy clusters and their dynamical relaxation, and machine learning in astronomy. In our projects, we use state-of-the-art neural-network architectures such as CNNs, LSTMs, and GANs. Thanks to modern surveys and instruments, there is a great deal of astronomical data in the form of images, spectra, and gravitational-wave signals, which can be analyzed using machine learning techniques. My team and I are working in these exciting domains, trying to provide a better understanding of the universe by applying machine learning to big astronomical data.
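To make the CNN idea concrete, here is a minimal, library-free sketch of the convolution operation at the heart of such networks; real projects would use a deep-learning framework, and the toy "image" below is purely illustrative:

```python
import numpy as np

def conv2d_valid(image, kernel):
    """'Valid' 2D cross-correlation of a grayscale image with a small kernel,
    the basic building block of a convolutional layer."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# Toy "image" with a bright region on the right; an edge-detecting kernel
# responds exactly where the brightness jumps.
img = np.zeros((5, 5))
img[:, 2:] = 1.0
edge_kernel = np.array([[-1.0, 1.0]])   # fires on left-to-right brightness jumps
response = conv2d_valid(img, edge_kernel)
```

A CNN stacks many such learned kernels (with nonlinearities and pooling) so that, for example, morphological features of galaxy images can be extracted automatically rather than hand-engineered.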
Calculating the energy of extensive air showers by machine learning methods (Dr. Hedayati's team)

The particles that make up extensive air showers come in three components: hadronic, muonic, and electromagnetic. Air showers contain a high-energy hadronic core that continuously feeds their electromagnetic component through photons produced by the decay of neutral pions and eta particles. Each energetic photon initiates an electromagnetic sub-shower through alternating pair production and bremsstrahlung. High-energy nucleons and other hadrons form the hadronic component, while pions and low-energy kaons decay to form the muonic component.

In this research, extensive air showers are simulated in parallel, which is made possible by parallel CPUs connected to the main system. Showers with different primary energies are simulated using CORSIKA in a multi-CPU configuration, and we use FileZilla to transfer files and keep track of the ongoing processes. The configuration has one main CPU and three peripheral CPUs. The main CPU starts a simulation and, whenever further simulation runs are needed, dispatches them to idle CPUs: each time a new air shower is launched, the main CPU sends the job to one of the idle CPUs and receives the final files from it. The advantage of this method over running on a single CPU under Linux is processing speed for showers produced by high-energy primary particles, because such particles generate many secondary particles, and processing them on a single CPU would be time-consuming and sometimes practically impossible. The purpose of these simulations is to produce a large dataset for machine learning and to write a program that analyzes real data.

Machine learning is a subfield of computer science that tries to find solutions to problems from data.
This method is effective when we do not know the exact solution and cannot write an explicit algorithm for it; instead, a machine learning method can study a large number of samples and learn the solution. In this way, machine learning can be viewed as a computational method for improving performance or increasing the accuracy of predictions. It often uses data that has been collected in advance and is available to users and researchers; what ultimately determines the accuracy of the predictions, however, is the quality and quantity of that data. The algorithms, the complexity of the theoretical concepts, and the amount of training data required all depend on the complexity of the model we want to design. Machine learning methods can also help us reduce costs. In this research, by performing the simulations described above, we obtain the required data and analyze it in order to write a program that estimates the primary energy from the characteristics of the shower. Since in simulation we have access to all of the information, we can verify that the learned model is correct; finally, we will apply the results of this method to real data from cosmic-ray observatories.
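As a hedged, minimal illustration of this train-on-simulation idea (not the group's actual pipeline), one can fit a linear model that predicts the log of the primary energy from shower observables and check it on held-out "showers". The synthetic data below merely mimics the fact that particle counts grow with energy; the scaling constants are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "simulated showers": true log10(E/eV) plus two observables
# (muonic and electromagnetic particle counts) with invented power-law
# scalings and a little log-normal scatter.
log_e = rng.uniform(15.0, 18.0, size=200)
n_mu = 10 ** (0.9 * log_e - 10.0) * rng.lognormal(0.0, 0.05, 200)
n_em = 10 ** (1.0 * log_e - 9.0) * rng.lognormal(0.0, 0.05, 200)

# Fit log-energy as a linear function of the log-observables (plus intercept),
# training on the first 150 showers.
X = np.column_stack([np.log10(n_mu), np.log10(n_em), np.ones(200)])
coef, *_ = np.linalg.lstsq(X[:150], log_e[:150], rcond=None)

# Evaluate on the 50 held-out showers.
pred = X[150:] @ coef
rms = np.sqrt(np.mean((pred - log_e[150:]) ** 2))
print(f"hold-out RMS error in log10(E): {rms:.3f}")
```

Because the simulation provides the true primary energy for every event, the hold-out error can be measured exactly, which is the verification step the text describes before the model is applied to real observatory data.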