Search results for: statistical physics
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4490


4010 Resonant Fluorescence in a Two-Level Atom and the Terahertz Gap

Authors: Nikolai N. Bogolubov, Andrey V. Soldatov

Abstract:

Terahertz radiation occupies a range of frequencies from roughly 100 GHz to approximately 10 THz, between microwaves and infrared waves. This range of frequencies holds promise for many useful applications in experimental applied physics and technology. At the same time, reliable, simple techniques for the generation, amplification, and modulation of electromagnetic radiation in this range are far from being developed enough to meet the requirements of practical usage, especially in comparison to the technological capabilities already achieved in other domains of the electromagnetic spectrum. This relative underdevelopment of a potentially very important range of the electromagnetic spectrum is known as the 'terahertz gap.' Among other things, technological progress in the terahertz area has been impeded by the lack of compact, low-energy-consumption, easily controlled, continuously radiating terahertz sources. Therefore, the development of new techniques serving this purpose, as well as of various devices based on them, is an obvious necessity. No doubt, it would be highly advantageous to employ the simplest suitable physical systems as the major critical components in such techniques and devices. The purpose of the present research was to show, by means of conventional methods of non-equilibrium statistical mechanics and the theory of open quantum systems, that a thoroughly studied two-level quantum system, also known as a one-electron two-level 'atom', driven by an external classical monochromatic high-frequency (e.g., laser) field, can radiate continuously at a much lower (e.g., terahertz) frequency in the fluorescent regime if the transition dipole moment operator of this 'atom' possesses permanent, non-equal diagonal matrix elements. This contradicts the assumption routinely made in quantum optics that only the non-diagonal matrix elements persist. The conventional assumption is pertinent to natural atoms and molecules and stems from the spatial inversion symmetry of their eigenstates. At the same time, such an assumption is no longer justified for artificially manufactured quantum systems of reduced dimensionality, such as quantum dots, which are often nicknamed 'artificial atoms' due to the striking similarity of their optical properties to those of real atoms. Possible routes to experimental observation and practical implementation of the predicted effect are also discussed.

Keywords: terahertz gap, two-level atom, resonant fluorescence, quantum dot

Procedia PDF Downloads 271
4009 Effectiveness of Homoeopathic Medicine Conium Maculatum 200 C for Management of Pyuria

Authors: Amir Ashraf

Abstract:

Homoeopathy is an alternative system of medicine founded by the German physician Samuel Hahnemann in 1796. It has been used by many people for various health conditions globally for more than 200 years. In India, homoeopathy is considered a major system of alternative medicine. Homoeopathy has been found effective in various medical conditions, including pyuria, the condition in which pus cells are found in urine. Homoeopathy is very useful for reducing pus cells, and homeopathically potentized Conium Mac (hemlock) is an important remedy commonly used for reducing pyuria. Aim: To reduce the number of pus cells found in urine using Conium Mac 200C. Methods: Design: small N design. Samples: purposive sampling of 5 cases diagnosed with pyuria. Tools: personal data schedule and ICD-10 criteria for pyuria. Techniques: potentized homoeopathic medicine, Conium Mac in the 200th potency, was used. Statistical Analysis: The statistical analyses were done using non-parametric tests. Results: A significant pre/post difference was identified. Conclusion: The homoeopathic potency Conium Mac 200C is effective in reducing the increased level of pus cells found in urine samples.

Keywords: homoeopathy, alternative medicine, pyuria, Conium Mac, small N design, non-parametric tests, homeopathic physician, Ashirvad Hospital, Kannur

Procedia PDF Downloads 335
4008 Computer Software for Calculating Electron Mobility of Semiconductor Compounds: Case Study for n-GaN

Authors: Emad A. Ahmed

Abstract:

Computer software to calculate electron mobility with respect to different scattering mechanisms has been developed. The software adopts a fully Graphical User Interface (GUI) approach, and its interface was designed in Microsoft Visual Basic 6.0. As a case study, the electron mobility of n-GaN was computed using this software. The behaviour of the mobility of n-GaN under elastic scattering processes, and its relation to temperature and doping concentration, are discussed. The results agree with other available theoretical and experimental data.
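The abstract does not give the underlying formulas, but scattering-limited mobilities of this kind are commonly combined via Matthiessen's rule. A minimal sketch, assuming illustrative (not measured) per-mechanism mobilities for n-GaN:

```python
import numpy as np

def total_mobility(mobilities_cm2_Vs):
    """Combine per-mechanism mobilities via Matthiessen's rule:
    1/mu_total = sum_i 1/mu_i."""
    mobilities = np.asarray(mobilities_cm2_Vs, dtype=float)
    return 1.0 / np.sum(1.0 / mobilities)

# Illustrative placeholder values for n-GaN (cm^2/V·s), not computed results:
mu_ionized_impurity = 1500.0
mu_acoustic_phonon = 2500.0
mu_piezoelectric = 4000.0

print(total_mobility([mu_ionized_impurity, mu_acoustic_phonon, mu_piezoelectric]))
```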

Keywords: electron mobility, relaxation time, GaN, scattering, computer software, computational physics

Procedia PDF Downloads 670
4007 Statistical Model to Examine the Impact of the Inflation Rate and Real Interest Rate on the Bahrain Economy

Authors: Ghada Abo-Zaid

Abstract:

Introduction: Oil is one of the main sources of income in Bahrain, and low oil prices influence economic growth and the investment rate. For example, economic growth was 3.7% in 2012 and fell to 2.9% in 2015. The investment rate was 9.8% in 2012 and fell to 5.9% and -12.1% in 2014 and 2015, respectively. The inflation rate peaked in 2013 at 3.3%. Objectives: The objective is to build statistical models to examine the effect of the inflation rate and the real interest rate on economic growth in Bahrain from 2000 to 2018. Methods: This study is based on 18 years of data, and a multiple regression model is used for the analysis. All missing data are omitted from the analysis. Results: The regression model is used to examine the association between Gross National Product (GNP), the inflation rate, and the real interest rate. We found that (i) an increase in the real interest rate decreases GNP, and (ii) an increase in the inflation rate has no effect on economic growth in Bahrain, since the average inflation rate was almost 2%, which is considered low. Conclusion: There is a significant impact of the real interest rate on GNP in Bahrain, while the inflation rate does not show any negative influence on GNP, as it was not large enough to affect the economic growth rate in Bahrain negatively.
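A minimal sketch of the kind of multiple regression described, assuming annual observations of GNP growth, inflation, and the real interest rate; the data below are synthetic placeholders, not the study's series:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
years = np.arange(2000, 2018)
inflation = rng.normal(2.0, 1.0, len(years))          # ~2% average, as in the abstract
real_rate = rng.normal(3.0, 1.5, len(years))
gnp_growth = 4.0 - 0.5 * real_rate + 0.1 * inflation + rng.normal(0, 0.5, len(years))

# GNP growth regressed on inflation and the real interest rate
X = sm.add_constant(np.column_stack([inflation, real_rate]))
model = sm.OLS(gnp_growth, X, missing='drop').fit()   # missing data omitted, as in the study
print(model.summary())
```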

Keywords: gross national product, Bahrain, regression model, interest rate

Procedia PDF Downloads 164
4006 Identifying and Quantifying Factors Affecting Traffic Crash Severity under Heterogeneous Traffic Flow

Authors: Praveen Vayalamkuzhi, Veeraragavan Amirthalingam

Abstract:

Studies of highway safety are becoming the need of the hour, as over 400 lives are lost every day in India due to road crashes. In order to evaluate the factors that lead to different levels of crash severity, it is necessary to investigate the level of safety of highways and their relation to crashes. In the present study, an attempt is made to identify the factors that contribute to road crashes and to quantify their effect on crash severity. The study was carried out on a four-lane divided rural highway in India. The variables considered in the analysis include components of the horizontal alignment of the highway, viz., straight or curved section; time of day; driveway density; presence of a median; median openings; gradient; operating speed; and annual average daily traffic. These variables were selected after a preliminary analysis. The major complexities in the study are the heterogeneous traffic and the speed variation between different classes of vehicles along the highway. To quantify the impact of each of these factors, statistical analyses were carried out using a logit model and negative binomial regression. The output from the statistical models showed that the horizontal components of the highway alignment, driveway density, time of day, operating speed, and annual average daily traffic have a significant relation with the severity of crashes, i.e., fatal as well as injury crashes. Further, annual average daily traffic has a more significant effect on severity than the other variables. The contribution of the highway's horizontal components to crash severity is also significant. Logit models can predict crashes better than negative binomial regression models. The results of the study will help transport planners to consider these aspects at the planning stage itself in the case of highways operated under heterogeneous traffic flow conditions.
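A hedged sketch of how such a binary logit for crash severity could be set up; the covariate names follow the abstract, but the data are simulated and the coefficients illustrative, not the study's estimates:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 500
curve = rng.integers(0, 2, n)            # 1 = curved section of horizontal alignment
driveway_density = rng.poisson(3, n)     # driveways per km (assumed unit)
night = rng.integers(0, 2, n)            # 1 = night-time
speed = rng.normal(80, 10, n)            # operating speed, km/h
aadt = rng.normal(20000, 5000, n)        # annual average daily traffic

# Simulated severity outcome: 1 = fatal/injury crash, 0 = otherwise
logit_p = -6 + 0.4*curve + 0.1*driveway_density + 0.5*night + 0.03*speed + 0.00005*aadt
severe = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit_p)))

X = sm.add_constant(np.column_stack([curve, driveway_density, night, speed, aadt]))
print(sm.Logit(severe, X).fit(disp=0).summary())
```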

Keywords: geometric design, heterogeneous traffic, road crash, statistical analysis, level of safety

Procedia PDF Downloads 302
4005 Regional Flood-Duration-Frequency Models for Norway

Authors: Danielle M. Barna, Kolbjørn Engeland, Thordis Thorarinsdottir, Chong-Yu Xu

Abstract:

Design flood values give estimates of flood magnitude within a given return period and are essential to making adaptive decisions around land use planning, infrastructure design, and disaster mitigation. Often, design flood values are needed at locations with insufficient data. Additionally, in hydrologic applications where flood retention is important (e.g., floodplain management and reservoir design), design flood values are required at different flood durations. A statistical approach to this problem is the development of a regression model for extremes in which some of the parameters depend on flood duration in addition to being covariate-dependent. In hydrology, this is called a regional flood-duration-frequency (regional QDF) model. Typically, the underlying statistical distribution is chosen to be the Generalized Extreme Value (GEV) distribution. However, as the support of the GEV distribution depends on both its parameters and the range of the data, special care must be taken in the development of the regional model. In particular, we find that the GEV is problematic when developing a GAMLSS-type analysis due to the difficulty of proposing a link function that is independent of the unknown parameters and the observed data. We discuss these challenges in the context of developing a regional QDF model for Norway.
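As an illustration of the underlying GEV machinery (not the authors' regional GAMLSS model), a duration-by-duration fit to synthetic annual maxima might look like this:

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(0)
durations_hours = [1, 24, 72]             # flood durations (illustrative)
for d in durations_hours:
    # Synthetic annual maximum flows for this duration, 50 years of record
    annual_maxima = rng.gumbel(loc=100.0 / np.log(d + 2), scale=20.0, size=50)
    shape, loc, scale = genextreme.fit(annual_maxima)
    # 100-year design flood = 0.99 quantile of the fitted GEV
    q100 = genextreme.ppf(1 - 1/100, shape, loc=loc, scale=scale)
    print(f"duration {d} h: 100-year design flood ~ {q100:.1f}")
```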

Keywords: design flood values, Bayesian statistics, regression modeling of extremes, extreme value analysis, GEV

Procedia PDF Downloads 72
4004 Examination of the Main Behavioral Patterns of Male and Female Students in Islamic Azad University

Authors: Sobhan Sobhani

Abstract:

This study examined the behavioral patterns of students and their determinants according to the 'symbolic interaction' sociological perspective, in the form of 7 hypotheses. The behavioral patterns of students were classified into 8 categories: religious, scientific, political, artistic, sporting, national, parents, and teachers. They were evaluated from student opinions on a five-point Likert rating scale. The statistical population included all male and female students of Islamic Azad University, Behbahan branch, among whom 600 students (268 females and 332 males) were selected randomly. The following statistical methods were used: frequency and percentage, mean, t-test, Pearson correlation coefficient, and multi-way analysis of variance. The results of the statistical analysis showed that: (1) There is a significant difference between male and female students in terms of disposition toward religious figures, artists, teachers, and parents. (2) There is a significant difference between students from urban and rural areas in terms of assuming behavioral patterns of religious, political, scientific, artistic, and national figures and teachers. (3) The most important criterion for selecting behavioral patterns among students is intellectual identification with the pattern. (4) The most important factor influencing the behavioral patterns of male and female students is parents, followed by friends. (5) Boys are affected by teachers, the Internet, and satellite programs more than girls; girls assume behavioral patterns from books more than boys. (6) There is a significant difference between students in the humanities, technical, medical, and engineering disciplines in terms of selecting religious and political figures as behavioral patterns. (7) There is a significant difference between students belonging to different subcultures in terms of assuming behavioral patterns of religious, scientific, and cultural figures. (8) Between first- and fourth-year students, there is a significant difference only in the selection of religious figures. (9) There is a significant negative correlation between the education level of parents and the selection of religious and political figures and teachers. (10) There is a significant negative correlation between family income and the selection of political and religious figures.

Keywords: behavioral patterns, male and female students, Islamic Azad University

Procedia PDF Downloads 365
4003 Simulation Based Analysis of Gear Dynamic Behavior in Presence of Multiple Cracks

Authors: Ahmed Saeed, Sadok Sassi, Mohammad Roshun

Abstract:

Gears are important components with a vital role in many rotating machines. One of the common causes of gear failure is tooth fatigue crack; however, its early detection remains a challenging task. The objective of this study is to develop a numerical model that simulates the effect of tooth cracks on the resulting gear vibrations and consequently permits early fault detection. In contrast to other published papers, this work incorporates the possibility of multiple simultaneous cracks with different depths. As cracks significantly alter the stiffness of the tooth, finite element software was used to determine the stiffness variation with respect to the angular position for different combinations of crack orientation and depth. A simplified six-degrees-of-freedom nonlinear lumped-parameter model of a one-stage spur gear system is proposed to study the vibration with and without cracks. The model developed for calculating the stiffness with the crack permitted updating of the physical parameters in the second-order equations of motion describing the vibration of the gearbox. The vibration simulation results of the gearbox were obtained using Simulink/Matlab. The effect of a single crack at different levels was studied thoroughly, and the changes in the mesh stiffness and the vibration response were found to be consistent with previously published works. In addition, various statistical time-domain parameters were considered; they showed different degrees of sensitivity toward the crack depth. Multiple cracks were also introduced at different locations, and the vibration response along with the statistical parameters were obtained again for a general case of degradation (increase in crack depth, crack number, and crack locations). It was found that although some parameters increase in value as the deterioration level increases, they show almost no change, or even decrease, when the number of cracks increases. Therefore, the use of any statistical parameter could be misleading if not considered in an appropriate way.
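The abstract does not list the specific statistical time-domain parameters, but indicators such as RMS, crest factor, and kurtosis are standard in this literature. A sketch, with a synthetic signal standing in for the Simulink output:

```python
import numpy as np

def time_domain_indicators(x):
    """Common condition indicators used to track gear crack severity."""
    x = np.asarray(x, dtype=float)
    rms = np.sqrt(np.mean(x**2))
    return {
        "rms": rms,
        "peak": np.max(np.abs(x)),
        "crest_factor": np.max(np.abs(x)) / rms,
        "kurtosis": np.mean((x - x.mean())**4) / np.var(x)**2,
        "skewness": np.mean((x - x.mean())**3) / np.std(x)**3,
    }

t = np.linspace(0, 1, 10_000)
healthy = np.sin(2 * np.pi * 350 * t)                       # mesh-frequency tone
cracked = healthy + 0.8 * np.exp(-((t % 0.05) / 0.002)**2)  # periodic impacts per revolution
print(time_domain_indicators(healthy))
print(time_domain_indicators(cracked))
```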

Keywords: spur gear, cracked tooth, numerical simulation, time-domain parameters

Procedia PDF Downloads 266
4002 High-Fidelity Materials Screening with a Multi-Fidelity Graph Neural Network and Semi-Supervised Learning

Authors: Akeel A. Shah, Tong Zhang

Abstract:

Computational approaches to learning the properties of materials are commonplace, motivated by the need to screen or design materials for a given application, e.g., semiconductors and energy storage. Experimental approaches can be both time-consuming and costly. Unfortunately, computational approaches such as ab-initio electronic structure calculations and classical or ab-initio molecular dynamics can themselves be too slow for the rapid evaluation of materials, which often involves thousands to hundreds of thousands of candidates. Machine-learning-assisted approaches have been developed to overcome the time limitations of purely physics-based approaches. These approaches, on the other hand, require large volumes of data for training (hundreds of thousands of points on many standard data sets such as QM7b). This means that they are limited by how quickly such a large data set of physics-based simulations can be established. At high fidelity, such as configuration interaction, composite methods such as G4, and coupled cluster theory, gathering such a large data set can become infeasible, which can compromise the accuracy of the predictions; many applications require high accuracy, for example band structures and energy levels in semiconductor materials and the energetics of charge transfer in energy storage materials. In order to circumvent this problem, multi-fidelity approaches can be adopted, for example the Δ-ML method, which learns a high-fidelity output from a low-fidelity result such as Hartree-Fock or density functional theory (DFT). The general strategy is to learn a map between the low- and high-fidelity outputs, so that the high-fidelity output is obtained as a simple sum of the physics-based low-fidelity result and a learned correction. Although this requires a low-fidelity calculation, it typically requires far fewer high-fidelity results to learn the correction map; furthermore, the low-fidelity result, such as Hartree-Fock or semi-empirical ZINDO, is typically quick to obtain. For high-fidelity outputs the result can be a speed-up of an order of magnitude or more. In this work, a new multi-fidelity approach is developed, based on a graph convolutional network (GCN) combined with semi-supervised learning. The GCN allows the material or molecule to be represented as a graph, which is known to improve accuracy, as in, for example, SchNet and MEGNet. The graph incorporates information regarding the numbers, types, and properties of atoms; the types of bonds; and bond angles. The key to the accuracy of multi-fidelity methods, however, is the incorporation of the low-fidelity output to learn the high-fidelity equivalent, in this case by learning their difference. Semi-supervised learning is employed to allow for different numbers of low- and high-fidelity training points, by using an additional GCN-based low-fidelity map to predict high-fidelity outputs. It is shown on 4 different data sets that a significant (at least one order of magnitude) increase in accuracy is obtained, using one to two orders of magnitude fewer low- and high-fidelity training points. One of the data sets is developed in this work, pertaining to 1000 simulations of quinone molecules (up to 24 atoms) at 5 different levels of fidelity, furnishing the energy, dipole moment, and HOMO/LUMO.
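A minimal Δ-ML illustration (deliberately simpler than the paper's GCN): learn the correction from a cheap low-fidelity output to the expensive high-fidelity one, so that only a small high-fidelity training set is needed. The features, data, and regressor choice below are assumptions for illustration:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 8))             # stand-in molecular descriptors
e_low = X @ rng.normal(size=8)             # cheap "Hartree-Fock-like" estimate
e_high = e_low + 0.3 * np.sin(X[:, 0]) + 0.1 * X[:, 1]**2   # "true" high fidelity

n_hf = 100                                 # only 100 high-fidelity labels available
delta_model = RandomForestRegressor(random_state=0)
delta_model.fit(X[:n_hf], e_high[:n_hf] - e_low[:n_hf])     # learn the correction

# Prediction = physics-based low-fidelity result + learned correction
e_pred = e_low[n_hf:] + delta_model.predict(X[n_hf:])
print("MAE:", np.mean(np.abs(e_pred - e_high[n_hf:])))
```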

Keywords: materials screening, computational materials, machine learning, multi-fidelity, graph convolutional network, semi-supervised learning

Procedia PDF Downloads 39
4001 Impact of Yogic Exercise on Cardiovascular Function on Selected College Students of High Altitude

Authors: Benu Gupta

Abstract:

The purpose of the study was to assess the impact of yogic exercise on the cardiovascular function of selected college students at high altitude. The research was conducted on college students at high altitude in Shimla, assessing their cardiovascular function [blood pressure (BP), VO2 max (TLC), and pulse rate (PR)] with respect to yogic exercise. A total of 139 students were randomly selected from Himachal University colleges in Shimla. The study was conducted in three phases: the subjects were identified in the first phase of the research program; in the next phase they were physiologically tested, and the yogic exercise battery was administered over different time frames. All subjects underwent three months of yogic exercise, after which the entire group of students was again evaluated physiologically [cardiovascular measurements: BP, VO2 max (TLC), and PR] with standard equipment. Statistical analyses of variance of the variables (PR, BP (SBP and DBP), and TLC) were performed. The results reveal a significant difference in TLC, whereas there was no significant difference in PR. For BP, the statistical analysis suggests no significant difference. The results showed that the BP of the participants was more inclined toward the normal standard BP, i.e., 120/80 mmHg.

Keywords: cardiovascular function, college students, high altitude, yogic exercise

Procedia PDF Downloads 231
4000 Statistical Tools for SFRA Diagnosis in Power Transformers

Authors: Rahul Srivastava, Priti Pundir, Y. R. Sood, Rajnish Shrivastava

Abstract:

For the interpretation of the signatures of sweep frequency response analysis (SFRA) of transformers, different statistical techniques serve as effective tools for performing either phase-to-phase comparison or sister-unit comparison. In this paper, along with a discussion of SFRA, several statistical techniques, such as the cross correlation coefficient (CCF), root square error (RSQ), comparative standard deviation (CSD), absolute difference, mean square error (MSE), and min-max ratio (MM), are presented through several case studies. These methods require the sample data size and spot frequencies of the SFRA signatures being compared. The techniques used are based on power signal processing tools that can simplify the result, and limits can be set for the severity of a fault occurring in the transformer due to short-circuit forces or ageing. The advantages of using statistical techniques for analyzing SFRA results are indicated through several case studies, and the results obtained determine the state of the transformer.
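A sketch of the named comparison statistics applied to two SFRA magnitude traces sampled at the same spot frequencies; the trace data, and the exact formulas (which vary slightly across the literature), are assumptions:

```python
import numpy as np

def ccf(a, b):       # cross correlation coefficient
    return np.sum(a * b) / np.sqrt(np.sum(a**2) * np.sum(b**2))

def mse(a, b):       # mean square error
    return np.mean((a - b)**2)

def dabs(a, b):      # absolute difference
    return np.sum(np.abs(a - b))

def mm_ratio(a, b):  # min-max ratio
    return np.sum(np.minimum(a, b)) / np.sum(np.maximum(a, b))

freqs = np.logspace(1, 6, 500)                      # spot frequencies, 10 Hz - 1 MHz
reference = 40 + 10 * np.sin(np.log10(freqs))       # fingerprint of the healthy unit
measured = reference + np.random.default_rng(0).normal(0, 0.5, freqs.size)

print(ccf(reference, measured), mse(reference, measured),
      dabs(reference, measured), mm_ratio(reference, measured))
```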

Keywords: absolute difference (DABS), cross correlation coefficient (CCF), mean square error (MSE), min-max ratio (MM-ratio), root square error (RSQ), comparative standard deviation (CSD), sweep frequency response analysis (SFRA)

Procedia PDF Downloads 697
3999 Physics-Based Earthquake Source Models for Seismic Engineering: Analysis and Validation for Dip-Slip Faults

Authors: Percy Galvez, Anatoly Petukhin, Paul Somerville, Ken Miyakoshi, Kojiro Irikura, Daniel Peter

Abstract:

Physics-based dynamic rupture modelling is necessary for estimating parameters such as rupture velocity and slip rate function that are important for ground motion simulation but poorly resolved by observations, e.g., by seismic source inversion. In order to generate a large number of physically self-consistent rupture models, whose rupture process is consistent with the spatio-temporal heterogeneity of past earthquakes, we use multicycle simulations under the heterogeneous rate-and-state (RS) friction law for a 45° dip-slip fault. We performed a parametrization study by fully dynamic rupture modeling, and then a set of spontaneous source models was generated over a large magnitude range (Mw > 7.0). In order to validate the rupture models, we compare the source scaling relations vs. seismic moment Mo for the modeled rupture area S, as well as the average slip Dave and the slip asperity area Sa, with similar scaling relations from source inversions. Ground motions were also computed from our models; their peak ground velocities (PGV) agree well with GMPE values. We obtained good agreement of the permanent surface offset values with empirical relations. From the heterogeneous rupture models, we analyzed parameters that are critical for ground motion simulations, i.e., the distributions of slip, slip rate, rupture initiation points, rupture velocities, and source time functions. We studied cross-correlations between them and with the friction weakening distance Dc, the only initial heterogeneity parameter in our modeling. The main findings are: (1) high slip-rate areas coincide with, or are located on an outer edge of, the large-slip areas; (2) ruptures have a tendency to initiate in small-Dc areas; and (3) high slip-rate areas correlate with areas of small Dc, large rupture velocity, and short rise-time.

Keywords: earthquake dynamics, strong ground motion prediction, seismic engineering, source characterization

Procedia PDF Downloads 144
3998 A Review on Existing Challenges of Data Mining and Future Research Perspectives

Authors: Hema Bhardwaj, D. Srinivasa Rao

Abstract:

Technology for analysing, processing, and extracting meaningful information from enormous and complicated datasets is commonly termed 'big data' technology. Big data mining and big data analysis are extremely helpful for business activities such as making decisions, building organisational plans, researching the market efficiently, and improving sales, because typical management tools cannot handle such complicated datasets. Big data brings special computational and statistical issues, such as measurement errors, noise accumulation, spurious correlation, and storage and scalability limitations. These unique problems call for new computational and statistical paradigms. This research paper offers an overview of the literature on big data mining and its process, along with its problems and difficulties, with a focus on the unique characteristics of big data. Organizations face several difficulties when undertaking data mining, which has an impact on their decision-making. Every day, terabytes of data are produced, yet only around 1% of that data is actually analyzed. This study presents the ideas of data mining, data analysis, and knowledge discovery techniques that have recently been developed, together with practical application systems. The article's conclusion also includes a list of issues and difficulties for further research in the area, and the report discusses the main big data and data mining challenges facing management.

Keywords: big data, data mining, data analysis, knowledge discovery techniques, data mining challenges

Procedia PDF Downloads 110
3997 Socio-Economic and Psychological Factors of Moscow Population Deviant Behavior: Sociological and Statistical Research

Authors: V. Bezverbny

Abstract:

The relevance of the project stems from the steady growth of deviant behavior statistics among Moscow citizens. In recent years, the socioeconomic health, wealth, and life expectancy of Moscow residents have grown steadily, but crime and drug addiction have also risen seriously. Another serious problem for Moscow has been the economic stratification of the population: the cost of otherwise identical residential areas differs by a factor of 2.5. The project is aimed at complex research and the development of a methodology for evaluating the main factors and reasons behind the growth of deviant behavior in Moscow. The main project objective is to find links between the quality of the urban environment and the dynamics of citizens' deviant behavior at the regional and municipal level, using statistical research methods and GIS modeling. The research conducted made it possible to: (1) evaluate the dynamics of deviant behavior in Moscow's different administrative districts; (2) describe the reasons for increasing crime, drug addiction, alcoholism, and suicide tendencies among the city population; (3) develop a classification of city districts based on the crime rate; (4) create a statistical database containing the main indicators of deviant behavior among the Moscow population in 2010-2015, including information on crime level, alcoholism, drug addiction, and suicides; (5) present statistical indicators that characterize the dynamics of deviant behavior among the Moscow population under conditions of expansion of the city territory; (6) analyze the main sociological theories and factors of deviant behavior in order to concretize the types of deviation; and (7) consider the main theoretical statements of urban sociology devoted to the reasons for deviant behavior under megalopolis conditions. To explore the differentiation of deviant behavior factors, a questionnaire was developed, and a sociological survey involving more than 1000 people from different districts of the city was conducted. The sociological survey made it possible to study the socio-economic and psychological factors of deviant behavior. It also included Moscow residents' open-ended answers regarding the most pressing problems in their districts and the reasons for wishing to leave their place of residence. The results of the sociological survey lead to the conclusion that the main factors of deviant behavior in Moscow are a high level of social inequality, a large number of illegal migrants and homeless people, the proximity of large transport hubs and stations, ineffective police work, alcohol availability and drug accessibility, a low level of psychological comfort for Moscow citizens, and a large number of building projects.

Keywords: deviant behavior, megapolis, Moscow, urban environment, social stratification

Procedia PDF Downloads 192
3996 Multi-Elemental Analysis Using Inductively Coupled Plasma Mass Spectrometry for the Geographical Origin Discrimination of Greek Giant Beans “Gigantes Elefantes”

Authors: Eleni C. Mazarakioti, Anastasios Zotos, Anna-Akrivi Thomatou, Efthimios Kokkotos, Achilleas Kontogeorgos, Athanasios Ladavos, Angelos Patakas

Abstract:

“Gigantes Elefantes” is a particularly dynamic crop of giant beans cultivated in western Macedonia (Greece). This variety of large beans grown in this area, specifically in the regions of Prespes and Kastoria, is a protected designation of origin (PDO) species with high nutritional quality. Mislabeling of geographical origin and blending with unidentified samples are common fraudulent practices in the Greek food market, with financial and possible health consequences. In recent decades, multi-elemental composition analysis has been used to identify the geographical origin of foods and agricultural products. In an attempt to discriminate the authenticity of Greek beans, multi-elemental analysis (Ag, Al, As, B, Ba, Be, Ca, Cd, Co, Cr, Cs, Cu, Fe, Ga, Ge, K, Li, Mg, Mn, Mo, Na, Nb, Ni, P, Pb, Rb, Re, Se, Sr, Ta, Ti, Tl, U, V, W, Zn, Zr) was performed by inductively coupled plasma mass spectrometry (ICP-MS) on 320 samples of beans originating from Greece (Prespes and Kastoria), China, and Poland. All samples were collected during the autumn of 2021. The obtained data were analysed by principal component analysis (PCA), an unsupervised statistical method that reduces the dimensionality of large datasets. The statistical analysis revealed a clear separation of beans cultivated in Greece from those from China and Poland. An adequate discrimination of geographical origin between bean samples originating from the two Greek regions, Prespes and Kastoria, was also evident. Our results suggest that multi-elemental analysis combined with the appropriate multivariate statistical method can be a useful tool for the geographical authentication of beans. Acknowledgment: This research has been financed by the Public Investment Programme/General Secretariat for Research and Innovation, under the call “YPOERGO 3, code 2018SE01300000; project title: Elaboration and implementation of methodology for authenticity and geographical origin assessment of agricultural products”.
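A sketch of the PCA step, assuming a samples-by-elements concentration matrix; the data below are simulated stand-ins for the ICP-MS measurements:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
# Rows: bean samples; columns: synthetic element concentrations
greek = rng.normal(1.0, 0.1, size=(100, 36))
imported = rng.normal(1.3, 0.1, size=(100, 36))
X = np.vstack([greek, imported])

# Standardize, then project onto the first two principal components
scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))

# Groups of different origin should separate along the leading components:
print(scores[:100, 0].mean(), scores[100:, 0].mean())
```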

Keywords: geographical origin, authenticity, multi-elemental analysis, beans, ICP-MS, PCA

Procedia PDF Downloads 78
3995 Application of Stochastic Models to Annual Extreme Streamflow Data

Authors: Karim Hamidi Machekposhti, Hossein Sedghi

Abstract:

This study was designed to find the best stochastic model (using time series analysis) for the annual extreme streamflow (peak and maximum streamflow) of the Karkheh River in Iran. The Auto-Regressive Integrated Moving Average (ARIMA) model was used to simulate these series and to forecast them into the future. For the analysis, annual extreme streamflow data of the Jelogir Majin station (above the Karkheh dam reservoir) for the years 1958–2005 were used. A visual inspection of the time plot shows a slight increasing trend; therefore, the series is not stationary. The non-stationarity observed in the auto-correlation function (ACF) and partial auto-correlation function (PACF) plots of annual extreme streamflow was removed using first-order differencing (d=1) prior to the development of the ARIMA model. Interestingly, the ARIMA(4,1,1) model was found to be the most suitable for simulating the annual extreme streamflow of the Karkheh River. The model was found to be appropriate for forecasting ten years of annual extreme streamflow and for assisting decision makers in establishing priorities for water demand. The Statistical Analysis System (SAS) and Statistical Package for the Social Sciences (SPSS) codes were used to determine the best model for this series.
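A minimal sketch of fitting the ARIMA(4,1,1) model identified in the study, using a synthetic stand-in for the 1958-2005 annual series:

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
# Synthetic annual peak flows with a mild trend (48 years, 1958-2005)
annual_peaks = 500 + np.cumsum(rng.normal(5, 50, 48))

model = ARIMA(annual_peaks, order=(4, 1, 1)).fit()  # d=1: first-order differencing
print(model.summary())
print(model.forecast(steps=10))                     # ten-year forecast, as in the study
```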

Keywords: stochastic models, ARIMA, extreme streamflow, Karkheh river

Procedia PDF Downloads 148
3994 Development of Trigger Tool to Identify Adverse Drug Events From Warfarin Administered to Patient Admitted in Medical Wards of Chumphae Hospital

Authors: Puntarikorn Rungrattanakasin

Abstract:

Objectives: To develop a trigger tool to warn of the risk of bleeding as an adverse event of warfarin usage during admission to the Medical Wards of Chumphae Hospital. Methods: A retrospective study was performed by reviewing the medical records of patients admitted between June 1st, 2020 and May 31st, 2021. ADEs were evaluated by Naranjo's algorithm. The international normalized ratio (INR) and events of bleeding during admissions were collected. Statistical analyses, including the Chi-square test and the Receiver Operating Characteristic (ROC) curve for the optimal INR threshold, were used in the study. Results: Among the 139 admissions, the INR was found to vary between 0.86 and 14.91. There was a total of 15 bleeding events, of which 9 were mild and 6 were severe. The occurrence of bleeding began whenever the INR was greater than 2.5 and reached statistical significance (p < 0.05), which was in concordance with the ROC curve and yielded 100% sensitivity and 60% specificity in the detection of a bleeding event. In this regard, an INR greater than 2.5 was considered the optimal threshold for promptly alerting to bleeding tendency. Conclusions: An INR value greater than 2.5 would be an appropriate trigger tool to warn of the risk of bleeding for patients taking warfarin in Chumphae Hospital.
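A sketch of how an INR cut-off can be chosen from the ROC curve (here via Youden's J statistic, one common choice); the INR values and bleeding labels are illustrative, not the hospital's records:

```python
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(0)
# Simulated admissions: 124 without bleeding, 15 with a bleeding event
inr = np.concatenate([rng.normal(2.0, 0.5, 124), rng.normal(3.5, 1.0, 15)])
bled = np.concatenate([np.zeros(124), np.ones(15)])

fpr, tpr, thresholds = roc_curve(bled, inr)
best = np.argmax(tpr - fpr)            # Youden's J = sensitivity + specificity - 1
print("optimal INR threshold ~", thresholds[best])
```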

Keywords: trigger tool, warfarin, risk of bleeding, medical wards

Procedia PDF Downloads 148
3993 Simulation of Reflectometry in Alborz Tokamak

Authors: S. Kohestani, R. Amrollahi, P. Daryabor

Abstract:

Microwave diagnostics such as reflectometry are receiving growing attention in magnetic confinement fusion research. In order to obtain a better understanding of plasma confinement physics, more detailed measurements of the density profile and its fluctuations may be required. A 2D full-wave simulation of ordinary-mode propagation has been written in an effort to model effects seen in reflectometry experiments. The code uses the finite-difference time-domain method with a perfectly-matched-layer absorption boundary to solve Maxwell's equations. The code has been used to simulate the reflectometer measurement in the Alborz Tokamak.
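For flavor, here is a minimal 1D FDTD update loop of the kind such full-wave codes are built on, in normalized units and with a hard-wall boundary in place of the PML; the actual code is 2D with perfectly matched layers:

```python
import numpy as np

nx, nt = 400, 800
ez = np.zeros(nx)        # electric field on integer grid points
hy = np.zeros(nx - 1)    # magnetic field, staggered by half a cell (Yee scheme)

for n in range(nt):
    hy += np.diff(ez)                         # update H from the curl of E
    ez[1:-1] += np.diff(hy)                   # update E from the curl of H
    ez[50] += np.exp(-((n - 60) / 20.0)**2)   # soft Gaussian source

print("field energy ~", np.sum(ez**2) + np.sum(hy**2))
```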

Keywords: reflectometry, simulation, ordinary mode, tokamak

Procedia PDF Downloads 420
3992 The Grand Unified Theory of Everything as a Generalization to the Standard Model Called as the General Standard Model

Authors: Amir Deljoo

Abstract:

The endeavor to comprehend existence has been at the center of human thought, taking the form of different disciplines and now, fundamentally, of physics in the search for a theory of everything. Here, after a brief review of the basic frameworks of thought, and a history of thought from ancient times up to the present, a logical methodology is presented based on a core axiom, after which a function, a proto-field, and then coordinates are explained. Afterwards, a generalization of the Standard Model is proposed, called the General Standard Model, which is believed to be the basis of the unified theory of everything.

Keywords: general relativity, grand unified theory, quantum mechanics, standard model, theory of everything

Procedia PDF Downloads 100
3991 Dynamical Relation of Poisson Spike Trains in the Hodgkin-Huxley Neural Ion Current Model and the Formation of Non-Canonical Bases, Islands, and Analog Bases in DNA, mRNA, and RNA at or near the Transcription

Authors: Michael Fundator

Abstract:

The groundbreaking application of biomathematical and biochemical research on neural network processes to the formation of non-canonical bases, islands, and analog bases in DNA and mRNA at or near the transcription, which contradicts the long-anticipated statistical assumptions for the distribution of bases and analog base compounds, is implemented through the apparatus of statistical and stochastic methods with the addition of quantum principles, where the usual transience of the Poisson spike train becomes a very instrumental tool for finding even almost-periodical types of solutions to the Fokker-Planck stochastic differential equation. The present article develops new multidimensional methods of finding solutions to stochastic differential equations based on a more rigorous approach to the mathematical apparatus through the Kolmogorov-Chentsov continuity theorem, which allows stochastic processes with jumps, under certain conditions, to have a γ-Hölder continuous modification that is used as the basis for finding analogous parallels in the dynamics of neural networks and the formation of analog bases and transcription in DNA.
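A short sketch of generating the homogeneous Poisson spike train referred to in the abstract; the rate and duration are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
rate_hz, duration_s = 20.0, 10.0

# Exponentially distributed inter-spike intervals yield Poisson spike counts.
isis = rng.exponential(1.0 / rate_hz, size=int(3 * rate_hz * duration_s))
spike_times = np.cumsum(isis)
spike_times = spike_times[spike_times < duration_s]

print(len(spike_times), "spikes; expected ~", rate_hz * duration_s)
```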

Keywords: Fokker-Planck stochastic differential equation, Kolmogorov-Chentsov continuity theorem, neural networks, translation and transcription

Procedia PDF Downloads 406
3990 Assessing the Self-Directed Learning Skills of the Undergraduate Nursing Students in a Medical University in Bahrain: A Quantitative Study

Authors: Catherine Mary Abou-Zaid

Abstract:

This quantitative study discusses concerns with the self-directed learning (SDL) skills of undergraduate nursing students in a medical university in Bahrain. The study covered all 4 years of the undergraduate nursing program, compiling data collected from the students themselves by survey questionnaire. The aim of the study is to understand and change attitudes toward self-directed learning among undergraduate students. The SDL of the undergraduate student nurses has been noticed to be lacking, and motivation to work without supervision outside the classroom is very low. Their use of the resources available in the virtual learning environment and within the university is not as good as it should be for university students at this level, and they do not use these resources to their own advantage. They are not prepared for the transition from high school to an academic environment such as a university or college. For some students, it is the first time in their academic lives that they have shared a classroom with the opposite sex; for some, this is a major issue, and we as academics need to be aware of all the issues they bring to higher education. Design Methodology: The design methodology chosen was quantitative, using convenience sampling of students who were asked to complete a survey questionnaire; this sampling method was chosen because of the time constraint. The questionnaire was completed by the undergraduate students themselves while in class, analyzed with the Statistical Package for the Social Sciences (SPSS), the results interpreted by the researcher, and the findings published in this paper. The analyzed data will also be reported on, and from this information we as educators will be able to see the students' weaknesses regarding self-directed learning. The aims and objectives of the research will be used as recommendations for improving the resources available to students to improve their SDL skills. Conclusion: The results will give educators an insight into how we can change the self-directed learning techniques of the students and enable them to embrace these skills and to focus more on being self-directed in their studies, rather than having to be put on an SDL pathway by the educators themselves. This evidence will come from the analysis of the statistical data. It may even change the way in which students are selected for the nursing programme. These recommendations will be reported to the head of school and to the nursing faculty.

Keywords: self-directed learning, undergraduate students, transition, statistical package for social sciences (SPSS), higher education

Procedia PDF Downloads 315
3989 Coastal Environment: Statistical Analysis and Geomorphic Impact on Urban Tourism in Lagos, Portugal

Authors: Magdalena Kuleta

Abstract:

Ponta da Piedade (37°05' N, 08°40' W) is an area located in the southern part of the Lagos municipality, which includes both abrasive and accumulative types of coastline. It is one of the main tourist destinations of the city. The dynamic growth in the attractiveness of the coast is related to the expansion of new tourism infrastructure and urban tourism products. These products are transportation, sightseeing, and entertainment in the form of boat trips, with each type of excursion corresponding to a different product. This progress also brings many risks, associated primarily with cliff landslides. The natural conditions affecting the coast have a huge impact on the evolution of urban tourism management. Based on observation, statistical analysis, and a survey method, the author compares the six-year period from 2012 to 2016 in terms of the number of tourists, the number and diversity of attractions, the most frequently chosen products, and infrastructure changes in the city. The methodology is based on data belonging to Turismo de Portugal and the tourist company Days of Adventure. The main result is to indicate the contribution of income from coastal tourism to the city's development and its influence on the marketing and promotion of urban tourism in Lagos.

Keywords: geomorphology of the coast in Lagos, market and promotion, quality of tourism service, urban tourism products

Procedia PDF Downloads 317
3988 Effect of Electronic Banking on the Performance of Deposit Money Banks in Nigeria: Using ATM and Mobile Phone as a Case Study

Authors: Charity Ifunanya Osakwe, Victoria Ogochuchukwu Obi-Nwosu, Chima Kenneth Anachedo

Abstract:

The study investigates how automated teller machines (ATMs) and mobile banking affect deposit money banks in the Nigerian economy. The study made use of time series data obtained from the Central Bank of Nigeria Statistical Bulletin from 2009 to 2021. The Central Bank of Nigeria (CBN) data on automated teller machines and mobile phones were used to proxy electronic banking, while total deposits in banks proxied the performance of deposit money banks. The analysis for the study was done using the ordinary least squares econometric technique with the aid of the EViews statistical package. The results show that the automated teller machine has a positive and significant effect on the total deposits of deposit money banks in Nigeria. The study concludes that e-banking has increased banking access for customers and created room for banks to expand their operations to more customers. The study recommends that banks in Nigeria should prioritize the expansion and maintenance of ATM networks as well as continue to invest in and develop more mobile banking services.

Keywords: electronic, banking, automated teller machines, mobile, deposit

Procedia PDF Downloads 53
3987 Competition between Regression Technique and Statistical Learning Models for Predicting Credit Risk Management

Authors: Chokri Slim

Abstract:

This research attempts to answer the following question: Is there a significant difference between regression models and statistical learning models in predicting credit risk management? A Multiple Linear Regression (MLR) model was compared with statistical learning models, namely a Multi-Layer Perceptron (MLP) neural network and Support Vector Regression (SVR). The population of this study comprises 50 banks listed on the Tunis Stock Exchange (TSE) from 2000 to 2016. First, we show the factors that have a significant effect on the quality of banks' loan portfolios in Tunisia. Second, the study attempts to establish that the systematic use of objective techniques and methods designed to apprehend and assess risk when considering applications for granting credit has a positive effect on the quality of banks' loan portfolios and their future collectability. Finally, we try to show that bank governance has an impact on the choice of methods and techniques for analyzing and measuring the risks inherent in the banking business, including the risk of non-repayment. The results of the empirical tests confirm our claims.
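A hedged sketch of the three-way model comparison (MLR vs. MLP vs. SVR) on a synthetic stand-in for the bank-level dataset; the features and scoring choice are assumptions:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 6))   # stand-in bank-level indicators
y = X @ rng.normal(size=6) + 0.5 * np.sin(X[:, 0]) + rng.normal(0, 0.3, 300)

for name, model in [("MLR", LinearRegression()),
                    ("MLP", MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)),
                    ("SVR", SVR(kernel="rbf"))]:
    score = cross_val_score(model, X, y, cv=5, scoring="r2").mean()
    print(f"{name}: mean CV R² = {score:.3f}")
```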

Keywords: credit risk management, multiple linear regression, principal components analysis, artificial neural networks, support vector machines

Procedia PDF Downloads 150
3986 Improving Road Infrastructure Safety Management Through Statistical Analysis of Road Accident Data. Case Study: Streets in Bucharest

Authors: Dimitriu Corneliu-Ioan, Gheorghe Frațilă

Abstract:

Romania has one of the highest rates of road deaths among European Union Member States, and there is a concern that the country will not meet its goal of "zero deaths" by 2050. The European Union also aims to halve the number of people seriously injured in road accidents by 2030. Therefore, there is a need to improve road infrastructure safety management in Romania. The aim of this study is to analyze road accident data through statistical methods to assess the current state of road infrastructure safety in Bucharest, to identify trends, and to make forecasts regarding serious road accidents and their consequences. The objective is to provide insights that can help prioritize measures to increase road safety, particularly in urban areas. The research utilizes statistical analysis methods, including exploratory analysis and descriptive statistics: databases from the Traffic Police and the Romanian Road Authority are analyzed using Excel, and road risks are compared with the main causes of road accidents to identify correlations. The study emphasizes the need for better quality and more diverse collection of road accident data for effective analysis in the field of road infrastructure engineering. The findings highlight the importance of prioritizing measures to improve road safety in urban areas, where serious accidents and their consequences are more frequent, and show a correlation between the measures ordered by road safety auditors and the main causes of serious accidents in Bucharest. The study also reveals the significant social costs of road accidents, amounting to approximately 3% of GDP, emphasizing the need for collaboration between local and central administrations in allocating resources for road safety through joint financial efforts. This research contributes to a clearer understanding of the current road infrastructure safety situation in Romania and provides critical insights that can aid decision-makers in allocating resources efficiently, cooperating institutionally to achieve sustainable road safety, and identifying appropriate solutions to reduce the number of serious road accidents in the future. It also underlines the importance of statistical data processing methods for substantiating managerial decisions in road infrastructure management.

Keywords: road death rate, strategic objective, serious road accidents, road safety, statistical analysis

Procedia PDF Downloads 84
3985 Using Business Intelligence Capabilities to Improve the Quality of Decision-Making: A Case Study of Mellat Bank

Authors: Jalal Haghighat Monfared, Zahra Akbari

Abstract:

Today, business executives need useful information to make better decisions. Banks have also been using information tools to direct the decision-making process in order to achieve their desired goals, rapidly extracting information from data sources with the help of business intelligence. This research investigates whether there is a relationship between the quality of decision making and the business intelligence capabilities of Mellat Bank. Each of the factors studied is divided into several components, and these, and the relationships between them, are measured by a questionnaire. The statistical population of this study consists of all managers and experts of Mellat Bank's general departments (190 people) who use business intelligence reports. The sample size of 123 was determined randomly by statistical methods. Relevant statistical inference was used for data analysis and hypothesis testing. In the first stage, the normality of the data was investigated using the Kolmogorov-Smirnov test; in the next stage, the construct validity of both variables and their resulting indexes was verified using confirmatory factor analysis. Finally, the research hypotheses were tested using structural equation modeling and Pearson's correlation coefficient. The results confirmed the existence of a positive relationship between decision quality and business intelligence capabilities in Mellat Bank. Among the various capabilities, including data quality, integration with other systems, user access, flexibility, and risk management support, the flexibility of the business intelligence system was the most strongly correlated with the dependent variable of the present research. This shows that it is necessary for Mellat Bank to pay more attention to choosing business intelligence systems with high flexibility in terms of the ability to produce custom-formatted reports. Next to flexibility, the quality of data in business intelligence systems showed the strongest relationship with the quality of decision making. Therefore, improving the quality of data, including the source of the data (internal or external), the type of data (quantitative or qualitative), the credibility of the data, and the perceptions of those who use the business intelligence system, improves the quality of decision making in Mellat Bank.

Keywords: business intelligence, business intelligence capability, decision making, decision quality

Procedia PDF Downloads 112
3984 Simulation of Performance of LaBr₃ (Ce) Using GEANT4

Authors: Zarana Dave

Abstract:

Cerium-doped lanthanum bromide, LaBr₃(Ce), is a scintillator with attractive properties for spectroscopy that make it a suitable solution for security, medical, geophysics, and high energy physics applications. Here, the performance parameters of a cylindrical LaBr₃(Ce) scintillator were investigated. The first aspect is the determination of the efficiency for γ-ray detection, computed with the GEANT4 simulation toolkit over the energy range from 10 keV to 10 MeV. The second is a detailed study of the background radiation of LaBr₃(Ce), which has a relatively high intrinsic radiation background due to the naturally occurring ¹³⁸La and ²²⁷Ac radioisotopes.

Keywords: LaBr₃(Ce), GEANT4, efficiency, background radiation

Procedia PDF Downloads 221
3983 Quantum Information Scrambling and Quantum Chaos in Silicon-Based Fermi-Hubbard Quantum Dot Arrays

Authors: Nikolaos Petropoulos, Elena Blokhina, Andrii Sokolov, Andrii Semenov, Panagiotis Giounanlis, Xutong Wu, Dmytro Mishagli, Eugene Koskin, Robert Bogdan Staszewski, Dirk Leipold

Abstract:

We investigate entanglement and quantum information scrambling (QIS) using the example of a many-body extended and spinless effective Fermi-Hubbard Model (EFHM and e-FHM, respectively) that describes a special type of quantum dot array provided by Equal1 Labs' silicon-based quantum computer. The concept of QIS is used in the framework of quantum information processing by quantum circuits and quantum channels. In general, QIS manifests as the delocalization of quantum information over the entire quantum system; more compactly, information about the input cannot be obtained by local measurements of the output of the quantum system. In our work, we first introduce the concept of quantum information scrambling and its connection with the 4-point out-of-time-order (OTO) correlators. In order to have a quantitative measure of QIS, we use the tripartite mutual information, along similar lines to previous works, which measures the mutual information between 4 different spacetime partitions of the system, and study the Transverse Field Ising (TFI) model; this is used to quantify the dynamical spreading of quantum entanglement and information in the system. We then investigate scrambling in the quantum many-body extended Hubbard model with external magnetic field Bz and spin-spin coupling J, for both uniform and thermal quantum channel inputs, and show that it scrambles for specific external tuning parameters (e.g., tunneling amplitudes, on-site potentials, magnetic field). In addition, we compare different Hilbert space sizes (different numbers of qubits) and show the qualitative and quantitative differences in quantum scrambling as the number of quantum degrees of freedom in the system increases. Moreover, we find a "scrambling phase transition" at a threshold temperature in the thermal case, that is, the temperature at which the channel starts to scramble quantum information. Finally, we make comparisons to the TFI model, highlight the key physical differences between the two systems, and mention some future directions of research.
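A small numerical sketch of the tripartite mutual information I3(A:C:D) = I(A:C) + I(A:D) - I(A:CD), evaluated here on a random 4-qubit pure state rather than the paper's Fermi-Hubbard dynamics; negative I3 is the scrambling diagnostic:

```python
import numpy as np

rng = np.random.default_rng(0)
psi = rng.normal(size=16) + 1j * rng.normal(size=16)   # random 4-qubit state ABCD
psi /= np.linalg.norm(psi)

def entropy(keep):
    """Von Neumann entropy (bits) of the reduced state on qubits in `keep`."""
    t = psi.reshape([2] * 4)
    trace_out = [q for q in range(4) if q not in keep]
    rho = np.tensordot(t, t.conj(), axes=(trace_out, trace_out))
    d = 2 ** len(keep)
    evals = np.linalg.eigvalsh(rho.reshape(d, d))
    evals = evals[evals > 1e-12]
    return -np.sum(evals * np.log2(evals))

def mutual_info(a, b):
    return entropy(a) + entropy(b) - entropy(a + b)

A, C, D = [0], [2], [3]
i3 = mutual_info(A, C) + mutual_info(A, D) - mutual_info(A, C + D)
print("I3 =", i3)   # negative values signal scrambling
```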

Keywords: condensed matter physics, quantum computing, quantum information theory, quantum physics

Procedia PDF Downloads 99
3982 A Statistical Model for the Geotechnical Parameters of Cement-Stabilised Hightown's Soft Soil: A Case Study of Liverpool, UK

Authors: Hassnen M. Jafer, Khalid S. Hashim, W. Atherton, Ali W. Alattabi

Abstract:

This study investigates the effect of two important parameters (the length of the curing period and the percentage of added binder) on the strength of soil treated with ordinary Portland cement (OPC). An intermediate-plasticity silty clayey soil with medium organic content was used in this study. This soft soil was treated with different percentages of a commercially available cement, type 32.5-N. Laboratory experiments were carried out on the soil treated with 0, 1.5, 3, 6, 9, and 12% OPC by dry weight to determine the effect of OPC on the compaction parameters, consistency limits, and compressive strength. The unconfined compressive strength (UCS) test was carried out on cement-treated specimens after exposing them to different curing periods (1, 3, 7, 14, 28, and 90 days). The results of the UCS test were used to develop a non-linear multi-regression model to find the relationship between the predicted and the measured maximum compressive strength of the treated soil (qu). The results indicated a significant improvement in the plasticity index (IP) from treatment with OPC; IP decreased from 20.2 to 14.1 with 12% OPC, a percentage sufficient to increase the UCS of the treated soil up to 1362 kPa after 90 days of curing. With respect to the statistical model of the predicted qu, the coefficient of determination (R²) was 0.8534, which indicates good reproducibility for the constructed model.
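A sketch of a non-linear multi-regression of qu against cement content and curing time; the functional form and the synthetic data are assumptions for illustration, not the model fitted in the paper:

```python
import numpy as np
from scipy.optimize import curve_fit

def qu_model(X, a, b, c):
    """Assumed form: strength grows with cement dosage C (%) and curing time t (days)."""
    C, t = X
    return a * C**b * np.log1p(t)**c

# Synthetic design matrix mirroring the paper's dosages and curing periods
cement = np.array([1.5, 3, 6, 9, 12] * 6, dtype=float)
days = np.repeat([1, 3, 7, 14, 28, 90], 5).astype(float)
qu = 40 * cement**0.9 * np.log1p(days)**0.8                  # synthetic "measurements"
qu *= 1 + np.random.default_rng(0).normal(0, 0.05, qu.size)  # 5% noise

params, _ = curve_fit(qu_model, (cement, days), qu, p0=[10, 1, 1])
pred = qu_model((cement, days), *params)
r2 = 1 - np.sum((qu - pred)**2) / np.sum((qu - np.mean(qu))**2)
print("fitted params:", params, "R² =", r2)
```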

Keywords: cement admixtures, soft soil stabilisation, geotechnical parameters, multi-regression model

Procedia PDF Downloads 366
3981 Monitoring Blood Pressure Using Regression Techniques

Authors: Qasem Qananwah, Ahmad Dagamseh, Hiam AlQuran, Khalid Shaker Ibrahim

Abstract:

Blood pressure gives physicians deep insight into the cardiovascular system, and the determination of an individual's blood pressure is a standard clinical procedure for assessing cardiovascular problems. The conventional techniques to measure blood pressure (e.g., the cuff method) allow only a limited number of readings over a certain period (e.g., every 5-10 minutes). Additionally, these systems cause turbulence in blood flow, impeding continuous blood pressure monitoring, especially in emergency cases or for critically ill persons. In this paper, the most important statistical features in photoplethysmogram (PPG) signals were extracted to estimate blood pressure noninvasively. PPG signals from more than 40 subjects were measured and analyzed, and 12 features were extracted. The features were fed to principal component analysis (PCA) to find the most important independent features having the highest correlation with blood pressure. The results show that the means and standard deviations of the stiffness index and of the beat-to-beat heart rate were the most important features. A model representing both features for systolic blood pressure (SBP) and diastolic blood pressure (DBP) was obtained using a statistical regression technique. Surface fitting was used to best fit the series of data, and the results show that the error in estimating the SBP is 4.95% and in estimating the DBP is 3.99%.
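A sketch of the feature-reduction-plus-surface-fitting pipeline described: PCA on the 12 PPG features followed by a second-order polynomial surface for SBP; all data below are simulated placeholders:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
features = rng.normal(size=(40, 12))   # 12 PPG features for 40 subjects (synthetic)
sbp = 120 + 8 * features[:, 0] - 5 * features[:, 1] + rng.normal(0, 2, 40)

pc = PCA(n_components=2).fit_transform(features)
x, y = pc[:, 0], pc[:, 1]

# Second-order polynomial surface: sbp ~ c0 + c1*x + c2*y + c3*x^2 + c4*x*y + c5*y^2
A = np.column_stack([np.ones_like(x), x, y, x**2, x * y, y**2])
coeffs, *_ = np.linalg.lstsq(A, sbp, rcond=None)
err = np.mean(np.abs(A @ coeffs - sbp) / sbp) * 100
print(f"mean absolute percentage error: {err:.2f}%")
```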

Keywords: blood pressure, noninvasive optical system, principal component analysis, PCA, continuous monitoring

Procedia PDF Downloads 161