Search results for: statistical physics
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4391

3971 Exercise and Aging Process Related to Oxidative Stress

Authors: B. Dejanova, S. Petrovska, L. Todorovska, J. Pluncevic, S. Mancevska, V. Antevska, E. Sivevska, I. Karagjozova

Abstract:

Introduction: The aging process is closely related to endothelial function, which may be impaired by oxidative stress (OS). Exercise is known to benefit the aging process and may improve health and prevent the onset of chronic diseases in the elderly. The aim of the study was to investigate OS markers related to exercise. Methods: A total of 80 subjects (healthy volunteers) were examined (38 male and 32 female), divided into 3 age groups: group I – ≤ 30 years (n=24); group II – 31-50 years (n=24); group III – ≥ 51 years (n=32). Each group was divided into subgroups of sedentary subjects (SS) and subjects who exercise (SE). Group I: SS (n=11), SE (n=13); group II: SS (n=13), SE (n=10); group III: SS (n=23), SE (n=9). Lipid peroxidation (LP), measured fluorimetrically with thiobarbituric acid, was used to estimate OS. Antioxidative status was determined by intracellular antioxidant enzymes, namely superoxide dismutase (SOD), glutathione peroxidase (GPx) and glucose-6-phosphate dehydrogenase (G-6-PD), and by extracellular antioxidants, namely glutathione reductase (GR), nitric oxide (NO) and total antioxidant capacity (TAC). Results: LP values increased with age: group I – 3.30±0.3 µmol/L; group II – 3.91±0.2 µmol/L; group III – 3.94±0.8 µmol/L (p<0.05), while no statistically significant difference was found between male and female subjects. No statistically significant difference in OS was found between SS and SE in group I, whereas significant differences were found in group II (p<0.05) and group III (p<0.01). No statistical significance was found for the cellular antioxidants or GR within the groups, while NO and TAC showed lower values in SS compared to SE in group II (p<0.05) and group III (p<0.05). Discussion and conclusion: The aging process showed increased OS, which may be due either to impaired function of free-radical scavengers or to their excessive production. Well-balanced exercise may be one of the factors that preserve the integrity of the blood-vessel endothelium and thereby slow the aging process. A possible mechanism of the beneficial influence of exercise is shear stress, through upregulation of genes governing nitric oxide bioavailability. Thus, from the results obtained, we may conclude that OS is diminished in the subject groups who exercise.

Keywords: oxidative stress, aging process, exercise, endothelial function

Procedia PDF Downloads 371
3970 Audit of TPS photon beam dataset for small field output factors using OSLDs against RPC standard dataset

Authors: Asad Yousuf

Abstract:

Purpose: The aim of the present study was to audit a treatment planning system (TPS) beam dataset for small-field output factors against the standard dataset produced by the Radiological Physics Center (RPC) in a multicenter study. Such data are crucial to the validity of special techniques, i.e., IMRT or stereotactic radiosurgery. Materials/Methods: In this study, multiple small-field output factor datasets were measured and calculated for 6 to 18 MV x-ray beams using the RPC-recommended methods. These beam datasets were measured at 10 cm depth for field sizes from 10 × 10 cm2 down to 2 × 2 cm2, defined by the collimator jaws at 100 cm. The measurements were made with Landauer nanoDot OSLDs, whose volume is small enough to gather a full ionization reading even for a 1 × 1 cm2 field size. At our institute, the beam data, including output factors, were commissioned at 5 cm depth with an SAD setup. For comparison with the RPC data, the output factors were converted to an SSD setup using tissue phantom ratios. The SSD setup also ensures coverage of the ion chamber in the 2 × 2 cm2 field size. The measured output factors were also compared with those calculated by the Eclipse™ treatment planning software. Results: The measured and calculated output factors agree with the RPC dataset within 1% and 4%, respectively. The larger discrepancies in the TPS reflect the increased challenge of converting measured data into a commissioned beam model for very small fields. Conclusion: OSLDs are a simple, durable, and accurate tool to verify doses delivered using small photon fields down to 1 × 1 cm2. The study emphasizes that the treatment planning system should always be evaluated for small-field output factors to ensure accurate dose delivery in the clinical setting.
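The audit logic above (ratio of locally measured output factors to a reference dataset, judged against a 1% agreement criterion) can be sketched as follows; the numbers are hypothetical and for illustration only, not the study's data:

```python
# Sketch of a small-field output-factor (OF) audit: ratio of locally
# measured OFs to a reference (RPC-style) dataset, flagged against a 1%
# criterion. All numbers are hypothetical, for illustration only.
of_measured = {10: 1.000, 6: 0.955, 4: 0.930, 3: 0.915, 2: 0.890}
of_reference = {10: 1.000, 6: 0.951, 4: 0.928, 3: 0.910, 2: 0.885}

tolerance = 0.01  # 1% agreement criterion
deviations = {}
for f in of_measured:
    dev = of_measured[f] / of_reference[f] - 1.0
    deviations[f] = dev
    status = "OK" if abs(dev) <= tolerance else "REVIEW"
    print(f"{f} x {f} cm2: {dev:+.2%} {status}")
```

In practice the flagged fields, not the passing ones, would drive a re-measurement or a review of the TPS beam model.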

Keywords: small field dosimetry, optically stimulated luminescence, audit treatment, radiological physics center

Procedia PDF Downloads 310
3969 Prospective Cohort Study on Sequential Use of Catheter with Misoprostol vs Misoprostol Alone for Second Trimester Medical Abortion

Authors: Hanna Teklu Gebregziabher

Abstract:

Background: A variety of techniques can be used for medical termination of second-trimester pregnancy, but there is no consensus about which is best. Most evidence suggests that the combined use of an intracervical Foley catheter and vaginal misoprostol is a safe, effective, and acceptable method for termination of second-trimester pregnancy, comparable to the mifepristone-misoprostol combination regimen, with lower cost and no additional maternal risk. Nevertheless, the use of mifepristone and misoprostol alone, with no other procedure, remains the most common approach in many institutions for second-trimester pregnancy. Methods: A comparative prospective study design was employed on women who were admitted for second-trimester medical abortion and in whom medical abortion failed or there was no change in cervical status 24 hours after the first dose of misoprostol. The study was conducted at St. Paul's Hospital Millennium Medical College. A sample of 44 participants in each arm was necessary for a two-tailed test with a type I error of 5%, 80% statistical power, and a 1:1 ratio between groups. Thus, a total of 94 cases, 47 from each arm, were recruited. Data were entered and cleaned using Epi Info, analyzed using SPSS version 29.0 statistical software, and presented in descriptive and tabular form. Variables were cross-tabulated and compared for significant differences using the chi-square test and the independent t-test. Results: There was a significant difference between the two groups in induction-to-expulsion time and in the number of doses used. The mean ± SD induction-to-expulsion time was 48.09 ± 11.86 for those who used misoprostol alone and 36.7 ± 6.772 for those who used a trans-cervical catheter sequentially with misoprostol. Conclusion: The use of a trans-cervical Foley catheter in conjunction with misoprostol in a sequential manner is a more effective, safe, and easily accessible procedure. In addition, the catheter costs less than misoprostol and is readily available. We therefore recommend the sequential use of a trans-cervical catheter for second-trimester medical abortion.
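The reported group difference can be checked with a Welch t-test computed directly from the summary statistics quoted above (mean ± SD, n = 47 per arm); this is an illustrative sketch, not the authors' SPSS analysis:

```python
from scipy.stats import ttest_ind_from_stats

# Summary statistics from the abstract: induction-to-expulsion time.
# Misoprostol alone: 48.09 +/- 11.86; catheter + misoprostol: 36.7 +/- 6.772.
t, p = ttest_ind_from_stats(mean1=48.09, std1=11.86, nobs1=47,
                            mean2=36.7, std2=6.772, nobs2=47,
                            equal_var=False)  # Welch's t-test
print(f"t = {t:.2f}, p = {p:.2g}")  # a large, highly significant difference
```

The resulting p-value is far below 0.05, consistent with the abstract's claim of a significant difference between the two arms.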

Keywords: second trimester, medical abortion, catheter, misoprostol

Procedia PDF Downloads 25
3968 Simulation of Wind Generator with Fixed Wind Turbine under Matlab-Simulink

Authors: Mahdi Motahari, Mojtaba Farzaneh, Armin Parsian Nejad

Abstract:

The rapidly growing wind industry strongly expresses the need for education and training worldwide, particularly at the system level. Modelling and simulating a wind generator system under Matlab-Simulink provides expert help in understanding wind systems engineering and system design. Working under Matlab-Simulink, we present the integration of the developed WECS model with the public electrical grid. A statistical comparison of the calculated power and Cp against the equivalent experimental data is performed. The statistical indicators of accuracy show good results for the presented method, with RMSE of 21% and 22%, MBE of 0.77% and 0.12%, and MAE of 3% and 4%. We then study the system's behavior when integrated into the whole power system. Three levels of wind speed were chosen: low, with a mean value of 5 m/s; medium, with a mean value of 8 m/s; and high, with a mean value of 12 m/s. These allowed predicting and supervising the active power produced by the system, characterized respectively by mean powers of -150 kW, -250 kW, and -480 kW, which is injected directly into the public electrical grid, and the reactive power, characterized respectively by mean powers of 60 kW, 180 kW, and 320 kW, which is consumed by the wind generator.
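The accuracy indicators quoted above (RMSE, MBE, MAE) are standard residual statistics; a minimal sketch of how they are computed, using toy numbers rather than the paper's measurements:

```python
import numpy as np

def rmse(pred, obs):  # root-mean-square error
    return float(np.sqrt(np.mean((np.asarray(pred) - np.asarray(obs)) ** 2)))

def mbe(pred, obs):   # mean bias error (signed: + means over-prediction)
    return float(np.mean(np.asarray(pred) - np.asarray(obs)))

def mae(pred, obs):   # mean absolute error
    return float(np.mean(np.abs(np.asarray(pred) - np.asarray(obs))))

# Toy data for illustration only (not the WECS measurements)
obs = [100.0, 150.0, 200.0]
pred = [110.0, 145.0, 205.0]
print(rmse(pred, obs), mbe(pred, obs), mae(pred, obs))
```

RMSE penalizes large errors quadratically, MBE keeps the sign (so systematic over- and under-prediction can cancel), and MAE averages the unsigned errors; the paper reports all three, normalized as percentages.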

Keywords: modelling, simulation, wind generator, fixed speed wind turbine, Matlab-Simulink

Procedia PDF Downloads 607
3967 Nonparametric Path Analysis with Truncated Spline Approach in Modeling Rural Poverty in Indonesia

Authors: Usriatur Rohma, Adji Achmad Rinaldo Fernandes

Abstract:

Nonparametric path analysis is a statistical method that does not rely on the assumption that the form of the curve is known. The purpose of this study is to determine the best nonparametric truncated-spline path function between linear and quadratic polynomial degrees with 1, 2, and 3 knot points, and to determine the significance of the estimate of the best function in the model of the effect of population migration and agricultural economic growth on rural poverty through the unemployment-rate variable, using the t-test statistic at the jackknife resampling stage. The data used in this study are secondary data obtained from statistical publications. The results show that the best nonparametric truncated-spline path model is the quadratic polynomial with 3 knot points. In addition, the significance of the estimation of the best truncated-spline nonparametric path function using jackknife resampling shows that all exogenous variables have a significant influence on the endogenous variables.
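A quadratic truncated spline with 3 knots, the best model reported above, is built from the truncated power basis; a minimal sketch (the knot locations here are illustrative assumptions, not the study's estimates):

```python
import numpy as np

def truncated_quadratic_basis(x, knots):
    """Design matrix for a degree-2 truncated power spline:
    columns [1, x, x^2, (x - k1)_+^2, ..., (x - kK)_+^2]."""
    x = np.asarray(x, dtype=float)
    cols = [np.ones_like(x), x, x ** 2]
    for k in knots:
        cols.append(np.clip(x - k, 0.0, None) ** 2)  # (x - k)_+^2
    return np.column_stack(cols)

# Illustrative: 3 knots, as in the best model reported above
X = truncated_quadratic_basis(np.linspace(0, 1, 50), knots=[0.25, 0.5, 0.75])
print(X.shape)  # one intercept, two polynomial columns, one column per knot
```

Fitting the path function then reduces to ordinary least squares on this design matrix, with the knot terms letting the fitted curve change its quadratic behavior at each knot.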

Keywords: nonparametric path analysis, truncated spline, linear, quadratic, rural poverty, jackknife resampling

Procedia PDF Downloads 20
3966 Parameter Estimation for the Oral Minimal Model and Parameter Distinctions Between Obese and Non-obese Type 2 Diabetes

Authors: Manoja Rajalakshmi Aravindakshan, Devleena Ghosh, Chittaranjan Mandal, K. V. Venkatesh, Jit Sarkar, Partha Chakrabarti, Sujay K. Maity

Abstract:

The Oral Glucose Tolerance Test (OGTT) is the primary test used to diagnose type 2 diabetes mellitus (T2DM) in the clinical setting. OGTT data are analyzed using the Oral Minimal Model (OMM), together with the rate of appearance of ingested glucose (Ra), to study differences in model parameters between control and T2DM groups. These parameter differences give insight into the behaviour and physiology of T2DM. The model is also used to find parameter differences between obese and non-obese T2DM subjects, and the sensitive parameters are correlated with known physiological findings. Sensitivity analysis is performed to understand how parameter values change with model output, and appropriate statistical tests are done to support the findings. This appears to be the first preliminary application of the OMM with obesity as a distinguishing factor in understanding T2DM from the estimated parameters of the insulin-glucose model, relating the statistical differences in parameters to diabetes pathophysiology.

Keywords: oral minimal model, OGTT, obese and non-obese T2DM, mathematical modeling, parameter estimation

Procedia PDF Downloads 79
3965 An Application of Sinc Function to Approximate Quadrature Integrals in Generalized Linear Mixed Models

Authors: Altaf H. Khan, Frank Stenger, Mohammed A. Hussein, Reaz A. Chaudhuri, Sameera Asif

Abstract:

This paper discusses a novel approach to approximating the quadrature integrals that arise in the estimation of likelihood parameters for generalized linear mixed models (GLMM). Bayesian methodology likewise requires the computation of multidimensional integrals with respect to posterior distributions, computations that are not only tedious and cumbersome but in some situations impossible because of singularities, irregular domains, etc. This work applies Sinc-function-based quadrature rules to approximate such intractable integrals. Sinc-based methods have several advantages: the order of convergence is exponential, they work very well in the neighborhood of singularities, they are in general quite stable, and they provide highly accurate, double-precision estimates. To our knowledge, this is the first use of the Sinc-function approach in the statistical domain; its viability and future scope are discussed for the estimation of parameters in GLMMs as well as in other statistical areas.
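The simplest Sinc quadrature rule evaluates the integrand on a uniform grid, h * sum f(kh), and converges exponentially for analytic, rapidly decaying integrands; a minimal sketch on the Gaussian integral:

```python
import numpy as np

def sinc_quadrature(f, h=0.5, N=30):
    """Approximate the integral of f over the real line by the Sinc
    (uniform trapezoidal) rule: h * sum_{k=-N..N} f(k h).
    Converges exponentially fast for analytic, rapidly decaying f."""
    k = np.arange(-N, N + 1)
    return float(h * np.sum(f(k * h)))

# Gaussian integral: the exact value is sqrt(pi)
approx = sinc_quadrature(lambda x: np.exp(-x ** 2))
print(approx, np.sqrt(np.pi))
```

Even with this coarse step size, the rule reproduces sqrt(pi) to near machine precision, which is the exponential-convergence property the abstract highlights; singular or semi-infinite domains are handled by composing with a conformal map before applying the same rule.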

Keywords: generalized linear mixed model, likelihood parameters, quadrature, Sinc function

Procedia PDF Downloads 381
3964 Mapping of Urban Green Spaces Towards a Balanced Planning in a Coastal Landscape

Authors: Rania Ajmi, Faiza Allouche Khebour, Aude Nuscia Taibi, Sirine Essasi

Abstract:

Urban green spaces (UGS) are an important contributor to sustainable development. A spatial method was employed to assess and map the distribution of UGS in five districts of Sousse, Tunisia. Ecological management of UGS is an essential factor in the sustainable development of the city; hence the municipality of Sousse decided to support the districts according to the character of their green spaces. To implement this policy, (1) a new GIS web application was developed, (2) the various green spaces were entered into it, (3) a spatial mapping of UGS was realized using Quantum GIS, and (4) data processing and statistical analysis were executed in the R programming language (RStudio). The intersection of the spatial and statistical analyses highlighted an imbalance in the spatial distribution of UGS in the study area. The discontinuity between the coast and the city's green spaces was not designed in a spirit of network and connection, hence the lack of a greenway connecting these spaces to the city. Finally, this GIS support will be used by decision-makers to assess and monitor green spaces in the city of Sousse and will contribute to improving the well-being of the local population.

Keywords: distributions, GIS, green space, imbalance, spatial analysis

Procedia PDF Downloads 180
3963 Effects of a Student-Centered Approach to Assessment on Students' Attitudes towards 'Applied Statistics' Course

Authors: Anduela Lile

Abstract:

The purpose of this cross-sectional study was to investigate the effectiveness of teaching and learning statistics from a student-centered perspective in higher education institutions. Statistics education has emphasized the application of tangible and interesting examples in order to motivate students' learning of statistical concepts. Participants in this study were 112 bachelor students enrolled in the 'Applied Statistics' course at the Sports University of Tirana. Experimental-group students received a student-centered teaching approach; control-group students received an instructor-centered teaching approach. The study found a statistically significant difference in end-of-course assessment scores between the student-centered group (52.1 ± 18.9) and the instructor-centered group (61.8 ± 16.4), (t(108) = 2.848, p = 0.005). The results suggest that a student-centered perspective can improve students' positive attitude toward statistical methods and motivate project work. The findings of this study may therefore be useful to higher education institutions in establishing their learning strategies, especially for courses related to statistics.
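The reported statistic t(108) = 2.848 can be checked against its two-tailed p-value from the t distribution; a small verification sketch:

```python
from scipy.stats import t as t_dist

# Two-tailed p-value for the reported independent t-test, t(108) = 2.848
t_stat, df = 2.848, 108
p_two_tailed = 2 * t_dist.sf(t_stat, df)  # sf = upper-tail probability
print(round(p_two_tailed, 3))
```

The computed value is close to the reported p = 0.005, so the quoted statistic and p-value are mutually consistent.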

Keywords: student-centered, instructor-centered, course assessment, learning outcomes, applied statistics

Procedia PDF Downloads 262
3962 Exploring Students' Alternative Conception in Vector Components

Authors: Umporn Wutchana

Abstract:

An open-ended problem and unstructured interviews were used to explore students' conceptual and procedural understanding of vector components. The open-ended problem was designed based on a research instrument used in previous physics education research. Without physical context, we asked students to find the magnitude and draw the graphical form of vector components. The problem was given to 211 first-year students of the faculty of science during the third (summer) semester of the 2014 academic year; the students, who were retaking the General Physics I course after failing it, spent approximately 15 minutes completing the problem. Their responses were classified based on the similarity of the errors they contained, and an unstructured interview was then conducted in which 7 randomly selected students were asked to reason about and explain their answers. The results showed that 53% of the 211 students provided the correct numerical magnitudes of the vector components, while 10.9% confused the magnitude of the x-component with that of the y-component; another 20.4% provided only symbols, and the remaining 15.6% gave no answer. When asked to draw the graphical form of the vector components, only 10% of the 211 students responded correctly. The majority produced errors revealing alternative conceptions: 46.5% drew the component vectors with incorrectly long and/or short magnitudes, and 43.1% drew vectors in other forms or wrote down other symbols. The unstructured interviews indicated that some students had simply memorized the method for obtaining the numerical magnitudes of the x- and y-components. Regarding the graphical form, some students thought the component vectors should be shorter than the given vector so that, combined, they would equal its length, while others thought the component vectors should have the same length as the given vector. It is likely that many students had not developed a strong foundational understanding of vector components but had learned by memorizing a solution or a way to compute a magnitude, attaching little meaning to the concept.

Keywords: graphical vectors, vectors, vector components, misconceptions, alternative conceptions

Procedia PDF Downloads 172
3961 Surface Quality Improvement of Abrasive Waterjet Cutting for Spacecraft Structure

Authors: Tarek M. Ahmed, Ahmed S. El Mesalamy, Amro M. Youssef, Tawfik T. El Midany

Abstract:

Abrasive waterjet (AWJ) machining is considered one of the most powerful cutting processes; it can be used for cutting heat-sensitive, hard, and reflective materials. Aluminum 2024 is a high-strength alloy widely used in the aerospace and aviation industries. This paper aims to improve the quality of AWJ cutting of this alloy and to investigate the effect of AWJ control parameters on surface geometry. Design of experiments (DoE) is used to establish an experimental matrix, and statistical modeling is used to relate the cutting parameters (pressure, traverse speed, and standoff distance between the nozzle and the cut surface) to the responses (taper angle and surface roughness). The results revealed a tangible improvement in productivity with AWJ processing. The kerf taper angle can be improved by decreasing standoff distance and speed and by increasing water pressure, while decreasing cutting speed, pressure, and standoff distance improves the surface roughness within the operating window of cutting parameters.
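A first-order DoE model of the kind described, with the response regressed on pressure, speed, and standoff distance, can be sketched with ordinary least squares; the design and coefficients below are synthetic, not the paper's data:

```python
import numpy as np

# Hypothetical 2^3 full factorial design in coded (-1/+1) levels:
# columns are pressure, traverse speed, standoff distance
X = np.array([[p, s, d] for p in (-1, 1) for s in (-1, 1) for d in (-1, 1)],
             dtype=float)
A = np.column_stack([np.ones(len(X)), X])  # prepend an intercept column

# Synthetic response following the abstract's trend for roughness:
# roughness rises with speed, pressure, and standoff (made-up coefficients)
beta_true = np.array([3.0, 0.4, 0.6, 0.3])
y = A @ beta_true

beta_hat, *_ = np.linalg.lstsq(A, y, rcond=None)
print(beta_hat)  # recovers the synthetic coefficients on noise-free data
```

With real measurements the fitted coefficients carry noise, and their signs and relative sizes are what identify which parameter most strongly drives each response.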

Keywords: abrasive waterjet machining, machining of aluminum alloy, non-traditional cutting, statistical modeling

Procedia PDF Downloads 239
3960 Prevalence of Breast Cancer Molecular Subtypes at a Tertiary Cancer Institute

Authors: Nahush Modak, Meena Pangarkar, Anand Pathak, Ankita Tamhane

Abstract:

Background: Breast cancer is a prominent cause of cancer and mortality among women. This study presents a statistical analysis of a cohort of over 250 patients with breast cancer diagnosed by oncologists using immunohistochemistry (IHC). IHC was performed using ER, PR, HER2, and Ki-67 antibodies. Materials and methods: Formalin-fixed, paraffin-embedded tissue samples were obtained surgically, and standard protocols were followed for fixation, grossing, tissue processing, embedding, cutting, and IHC. The Ventana BenchMark XT system was used for automated IHC of the samples; antibodies were supplied by F. Hoffmann-La Roche Ltd. Statistical analysis was performed using SPSS for Windows; the tests performed were the chi-squared test and correlation tests at p < .01. The raw data were collected and provided by the National Cancer Institute, Jamtha, India. Results: Luminal B was the most prevalent molecular subtype of breast cancer at our institute; a chi-squared test of homogeneity was performed to assess equality of distribution. The prognosis in breast cancer worsens with expression of Ki-67 and HER2 protein in cancerous cells, and at p < .01 significant dependence was observed. No dependence of molecular subtype on age was found; similarly, age was independent of Ki-67 expression. A chi-squared test on patients' HER2 status showed strong dependence between the percentage of Ki-67 expression and HER2 (+/-) status, i.e., the Ki-67 value depends on HER2 expression in cancerous cells (p < .01). Surprisingly, dependence was also observed between Ki-67 and PR at p < .01, suggesting that progesterone receptor (PR) proteins are over-expressed when Ki-67 expression is elevated. Conclusion: We conclude that Luminal B is the most prevalent molecular subtype at the National Cancer Institute, Jamtha, India. No significant correlation was found between age and Ki-67 expression in any molecular subtype, and no dependence or correlation exists between patients' age and molecular subtype. We also found that, in the cohort of 257 patients, no patient diagnosed as Luminal A showed a Ki-67 value > 14%. Statistically, extremely significant dependence of PR+HER2- and PR-HER2+ scores on Ki-67 expression was observed (p < .01). HER2 is an important prognostic factor in breast cancer; the chi-squared test for HER2 and Ki-67 shows that Ki-67 expression depends on HER2 status. Moreover, Ki-67 cannot be used as a standalone prognostic factor in breast cancer.
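A chi-squared test of independence of the kind used above (HER2 status vs. Ki-67 expression) can be sketched as follows; the contingency table is hypothetical, not the study's counts:

```python
from scipy.stats import chi2_contingency

# Hypothetical 2x2 contingency table of patient counts, illustration only:
# rows: HER2-negative, HER2-positive
# cols: Ki-67 low (<= 14%), Ki-67 high (> 14%)
table = [[70, 50],
         [20, 80]]
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.2g}")
```

A small p-value here rejects independence, i.e., Ki-67 level is associated with HER2 status in the (hypothetical) table, which is the kind of dependence the abstract reports at p < .01.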

Keywords: breast cancer molecular subtypes, correlation, immunohistochemistry, Ki-67 and HR, statistical analysis

Procedia PDF Downloads 110
3959 Characterization of Climatic Drought in the Saiss Plateau (Morocco) Using Statistical Indices

Authors: Abdeghani Qadem

Abstract:

Climate change is now an undeniable reality, with increasing impacts on water systems worldwide, notably severe drought episodes. The southern Mediterranean region is particularly affected by drought, which can have devastating consequences for water resources. Morocco, by virtue of its location in North Africa and the southern Mediterranean, is especially vulnerable to these effects of climate change, particularly drought. In this context, this article focuses on climate variability and drought characteristics in the Saiss Plateau region and its adjacent areas of the Middle Atlas, using specific statistical indices. The study begins by analyzing annual precipitation variation, with particular emphasis on data homogenization and gap filling using a regional vector. The analysis then examines drought episodes in the region using the Standardized Precipitation Index (SPI) over a 12-month timescale. The central objective is to accurately assess significant drought changes between 1980 and 2015, based on data collected from nine meteorological stations located in the study area.
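The SPI is computed by fitting a distribution (commonly a gamma) to the accumulated precipitation series and mapping its cumulative probabilities through the inverse standard normal; a simplified sketch on synthetic data (operational SPI also handles zero-precipitation months separately):

```python
import numpy as np
from scipy.stats import gamma, norm

def spi(precip):
    """Standardized Precipitation Index, simplified sketch:
    fit a gamma distribution to the accumulated precipitation series,
    then map its CDF through the inverse standard normal."""
    precip = np.asarray(precip, dtype=float)
    a, loc, scale = gamma.fit(precip, floc=0)  # fix the location at zero
    return norm.ppf(gamma.cdf(precip, a, loc=loc, scale=scale))

# Synthetic 12-month accumulated totals (30 years of monthly values)
rng = np.random.default_rng(0)
series = rng.gamma(shape=4.0, scale=100.0, size=360)
z = spi(series)
print(z.mean(), z.std())  # roughly standard normal by construction
```

Values of the index below about -1.0, -1.5, and -2.0 are conventionally read as moderate, severe, and extreme drought, which is how station series such as the nine used in this study are classified.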

Keywords: climate variability, regional vector, drought, standardized precipitation index, Saiss Plateau, Middle Atlas

Procedia PDF Downloads 49
3956 Impact of Instagram Food Bloggers on Consumer (Generation Z) Decision Making Process in Islamabad, Pakistan

Authors: Tabinda Sadiq, Tehmina Ashfaq Qazi, Hoor Shumail

Abstract:

Recently, the advent of emerging technology has created a new era of restaurant marketing. This study explores the aspects that influence customers' decision-making process in selecting a restaurant after reading food bloggers' reviews online. The motivation behind this research is to investigate the correlation between source credibility and attitude toward restaurant visits. The researcher collected data by distributing a survey questionnaire through Google Forms, employing source credibility theory. A non-probability purposive sampling technique was used to collect data. The questionnaire used a pre-developed and validated scale by Ohanian to measure the relationship, and data were collected from 250 respondents in order to investigate the influence of food bloggers on Generation Z's decision-making process. SPSS version 26 was used for statistical testing and analysis of the data. The findings of the survey revealed a moderate positive correlation between the variables, so it can be concluded that food bloggers do have an impact on Generation Z's decision-making process.

Keywords: credibility, decision making, food bloggers, generation z, e-wom

Procedia PDF Downloads 58
3957 An Investigation of Surface Water Quality in an Industrial Area Using Integrated Approaches

Authors: Priti Saha, Biswajit Paul

Abstract:

Rapid urbanization and industrialization have increased the pollution load in surface water bodies. These water bodies are, however, a major source of water for drinking, irrigation, industrial activities, and fishery; water quality assessment is therefore of paramount importance in evaluating their suitability for all these purposes. This study evaluates the surface water quality of an industrial city in eastern India by integrating interdisciplinary techniques. A multi-purpose Water Quality Index (WQI) assessed the suitability of forty-eight sampling locations for drinking, irrigation, and fishery: 8.33% have excellent water quality (WQI: 0-25) for fishery, while 10.42%, 20.83%, and 45.83% have good quality (WQI: 25-50) for drinking, irrigation, and fishery, respectively. Industrial water quality was assessed through the Ryznar Stability Index (RSI), which showed that only 6.25% of sampling locations have neither corrosive nor scale-forming properties (RSI: 6.2-6.8). Integration of these statistical analyses with a geographical information system (GIS) enables spatial assessment, identifying the regions where the water quality is suitable for drinking, irrigation, fishery, and industrial use. This research demonstrates the effectiveness of statistical and GIS techniques for water quality assessment.
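The Ryznar Stability Index used above is RSI = 2*pHs - pH, where pHs is the calcium-carbonate saturation pH; a minimal sketch applying the study's 6.2-6.8 "balanced" band (the input pH values here are hypothetical):

```python
def ryznar_index(ph, ph_s):
    """Ryznar Stability Index: RSI = 2*pHs - pH, where pHs is the pH
    at which the water is saturated with calcium carbonate."""
    return 2.0 * ph_s - ph

def classify(rsi):
    # Band used in the study: RSI 6.2-6.8 is neither corrosive
    # nor scale-forming; below it scale-forming, above it corrosive.
    if rsi < 6.2:
        return "scale-forming"
    if rsi <= 6.8:
        return "balanced"
    return "corrosive"

rsi = ryznar_index(ph=7.5, ph_s=7.0)  # hypothetical sample
print(rsi, classify(rsi))
```

Computing pHs itself requires the sample's calcium hardness, alkalinity, temperature, and total dissolved solids; the sketch assumes it is already known.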

Keywords: surface water, water quality assessment, water quality index, spatial assessment

Procedia PDF Downloads 161
3956 An Appraisal of Maintenance Management Practices in Federal University Dutse and Jigawa State Polytechnic Dutse, Nigeria

Authors: Aminu Mubarak Sadis

Abstract:

This study appraised maintenance management practice at Federal University Dutse and Jigawa State Polytechnic Dutse, Nigeria. The Physical Planning, Works and Maintenance Departments of the two higher institutions are responsible for the production and maintenance management of their physical assets, and over-enrollment has been a common feature of higher institutions in Nigeria. Data were collected through administered questionnaires, with subsequent oral interviews to authenticate the completed questionnaires. A random sampling technique was used to select 150 respondents across the two institutions. The data collected were analyzed using the Statistical Package for the Social Sciences (SPSS) and t-test statistical techniques. The conclusion was that maintenance management activities are yet to be given appropriate attention in the functions of the university and polytechnic, which are crucial to improving teaching, learning, and research. The units responsible for maintenance and facility management should focus on their stated functions and effect changes where possible.

Keywords: appraisal, maintenance management, university, Polytechnic, practices

Procedia PDF Downloads 227
3955 The Concept of Neurostatistics as a Neuroscience

Authors: Igwenagu Chinelo Mercy

Abstract:

This study concerns the concept of neurostatistics in relation to neuroscience. Neuroscience, also known as neurobiology, is the scientific study of the nervous system. In the study of neuroscience, it has been noted that brain function, and its relation to the processes of acquiring knowledge and behaviour, can be better explained by the use of various interrelated methods, and the scope of neuroscience has broadened over time to include different approaches to studying the nervous system at different scales. Neurostatistics, as viewed in this study, is a statistical concept that applies techniques analogous to neuron mechanisms to solve problems, especially in the life sciences. The study is timely in this era of artificial intelligence and machine learning: a clear understanding of the technique and its proper application could assist in addressing medical disorders associated with the nervous system, and could also help the lay reader understand the workings of the nervous system and the health challenges associated with it. To make the concept concrete, an illustrative example using a brain-associated disorder was used for demonstration, with structural equation modelling adopted for the analysis. The results clearly show the link between the techniques of statistical modelling and the nervous system. Hence, based on this study, the appropriateness of applying neurostatistics to neuroscience rests on understanding the behavioural patterns of both concepts.

Keywords: brain, neurons, neuroscience, neurostatistics, structural equation modeling

Procedia PDF Downloads 54
3954 Quantum Mechanics as A Limiting Case of Relativistic Mechanics

Authors: Ahmad Almajid

Abstract:

The idea of unifying quantum mechanics with general relativity is still a dream for many researchers, as physics has only two paths, no more. Einstein's path, which is mainly based on particle mechanics, and the path of Paul Dirac and others, which is based on wave mechanics, the incompatibility of the two approaches is due to the radical difference in the initial assumptions and the mathematical nature of each approach. Logical thinking in modern physics leads us to two problems: - In quantum mechanics, despite its success, the problem of measurement and the problem of wave function interpretation is still obscure. - In special relativity, despite the success of the equivalence of rest-mass and energy, but at the speed of light, the fact that the energy becomes infinite is contrary to logic because the speed of light is not infinite, and the mass of the particle is not infinite too. These contradictions arise from the overlap of relativistic and quantum mechanics in the neighborhood of the speed of light, and in order to solve these problems, one must understand well how to move from relativistic mechanics to quantum mechanics, or rather, to unify them in a way different from Dirac's method, in order to go along with God or Nature, since, as Einstein said, "God doesn't play dice." From De Broglie's hypothesis about wave-particle duality, Léon Brillouin's definition of the new proper time was deduced, and thus the quantum Lorentz factor was obtained. Finally, using the Euler-Lagrange equation, we come up with new equations in quantum mechanics. In this paper, the two problems in modern physics mentioned above are solved; it can be said that this new approach to quantum mechanics will enable us to unify it with general relativity quite simply. If the experiments prove the validity of the results of this research, we will be able in the future to transport the matter at speed close to the speed of light. 
Finally, this research yielded three important results: (1) the Lorentz quantum factor; (2) Planck energy as a limiting case of Einstein energy; (3) real quantum mechanics, in which new equations for quantum mechanics match and exceed Dirac's equations, reached in a completely different way from Dirac's method. These equations show that quantum mechanics is a limiting case of relativistic mechanics. At the Solvay Conference in 1927, the debate about quantum mechanics between Bohr, Einstein, and others reached its climax: Bohr suggested that particles that are not observed are in a probabilistic state, whereupon Einstein made his famous claim that "God does not play dice." Thus, Einstein was right, especially in not accepting the principle of indeterminacy in quantum theory, although experiments support quantum mechanics. However, the results of our research indicate that God really does not play dice; when the electron disappears, it turns into amicable particles or an elastic medium, according to the above equations. Likewise, Bohr was also right when he indicated that there must be a science like quantum mechanics to monitor and study the motion of subatomic particles; but the picture in front of him was blurry and unclear, so he resorted to the probabilistic interpretation.

Keywords: Lorentz quantum factor, Planck's energy as a limiting case of Einstein's energy, real quantum mechanics, new equations for quantum mechanics

Procedia PDF Downloads 62
3953 The Trend of Injuries in Building Fire in Tehran from 2002 to 2012

Authors: Mohammadreza Ashouri, Majid Bayatian

Abstract:

Analysis of fire data is a prerequisite for implementing any plan to improve the level of safety in cities. Such an analysis can reveal signs of change over a given period and can be used as a measure of safety. Information on about 66,341 fires (from 2002 to 2012) released by the Tehran Safety Services and Fire-Fighting Organization, together with data on the population and the number of households provided by Tehran Municipality and the Statistical Yearbook of Iran, was extracted. Using these data, the changes in fires, the injury rate, and the mortality rate were determined and analyzed. The injury and mortality rates of fires were 59.58 and 86.12 per one million population of Tehran, respectively. During the study period, the number of fires and fire stations increased by 104.38% and 102.63%, respectively. Most fires (9.21%) happened in the 4th District of Tehran. The results showed that the recorded fire data have not been used systematically for fire prevention, although one of the ways to reduce injuries caused by fires is to develop a systematic plan of necessary actions for emergency situations. To establish a reliable basis for fire prevention, the stages, the definitions of working processes, and the cause-and-effect chains should be considered. Therefore, a comprehensive statistical system should be developed for reported and recorded fire data.

Keywords: fire statistics, fire analysis, accident prevention, Tehran

Procedia PDF Downloads 169
3952 Reduction of Defects Using Seven Quality Control Tools for Productivity Improvement at Automobile Company

Authors: Abdul Sattar Jamali, Imdad Ali Memon, Maqsood Ahmed Memon

Abstract:

Production quality near zero defects is an objective of every manufacturing and service organization. To maintain and improve quality by reducing defects, organizations use statistical tools; among the many tools available for assessing quality, the traditional seven quality control (7QC) tools are widely used in manufacturing and the automobile industry. In this work, the 7QC tools were successfully applied at an automobile company in Pakistan. A preliminary survey was carried out for the implementation of the 7QC tools in the assembly line, and two inspection points were selected for data collection: the chassis line and the trim line. Defect data were collected at both points with the aim of reducing defects and ultimately improving productivity, and each of the 7QC tools showed its benefit in the results. Flow charts were developed for a better understanding of the inspection points for data collection; check sheets helped collect the defect data; histograms represented the severity level of defects; Pareto charts showed the cumulative effect of defects; cause-and-effect diagrams were developed to find the root causes of each defect; scatter diagrams showed whether defects were increasing or decreasing; and p-control charts flagged out-of-control points beyond the limits for corrective action. The successful implementation of the 7QC tools at the inspection points led to a considerable reduction in defect levels: on the chassis line from 132 defects to 13, a 90% reduction, and on the trim line from 157 defects to 28, an 82% reduction.
As the company exercised only a few of the 7QC tools, it has not fully reaped their benefits. It is therefore suggested that the company establish a mechanism for applying the 7QC tools in every section.
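The two quantitative elements of the abstract can be sketched in a few lines: the reported defect reductions (132 → 13 and 157 → 28) and the 3-sigma limits of a p-control chart. The daily sample figures below are hypothetical stand-ins, not the company's data:

```python
import math

def reduction_pct(before, after):
    """Percentage reduction in defect count."""
    return 100 * (before - after) / before

# Defect reductions reported in the abstract
assert round(reduction_pct(132, 13)) == 90   # chassis line, ~90%
assert round(reduction_pct(157, 28)) == 82   # trim line, ~82%

def p_chart_limits(defectives, sample_sizes):
    """Centre line and 3-sigma control limits for a p-chart
    (fraction-defective chart), using the average sample size."""
    p_bar = sum(defectives) / sum(sample_sizes)
    n_bar = sum(sample_sizes) / len(sample_sizes)
    sigma = math.sqrt(p_bar * (1 - p_bar) / n_bar)
    return p_bar, max(0.0, p_bar - 3 * sigma), p_bar + 3 * sigma

# Hypothetical daily data: defective units out of 200 inspected per day
cl, lcl, ucl = p_chart_limits([12, 9, 15, 7, 11], [200] * 5)
print(round(cl, 3), round(lcl, 3), round(ucl, 3))  # → 0.054 0.006 0.102
```

Points falling above the upper limit on such a chart are the "out of control points beyond the limits" that trigger corrective action.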

Keywords: check sheet, cause and effect diagram, control chart, histogram

Procedia PDF Downloads 307
3951 Statistical Analysis of the Impact of Maritime Transport Gross Domestic Product (GDP) on Nigeria’s Economy

Authors: Kehinde Peter Oyeduntan, Kayode Oshinubi

Abstract:

Nigeria is referred to as the 'Giant of Africa' due to its large population, land mass, and economy. However, it still trails far behind many smaller economies on the continent in terms of maritime operations. Since the maritime industry is a spark plug for national growth, housing the most crucial wealth-generating infrastructure, it is worrisome that a nation with six seaports lags in maritime activities. In this research, we studied how the Gross Domestic Product (GDP) of maritime transport influences the Nigerian economy. To do this, we applied Simple Linear Regression (SLR), Support Vector Machine (SVM), Polynomial Regression Model (PRM), Generalized Additive Model (GAM) and Generalized Linear Mixed Model (GLMM) to model the relationship between the nation's Total GDP (TGDP) and the Maritime Transport GDP (MGDP) using a 20-year time series. The results showed that the MGDP is statistically significant to the Nigerian economy. Among the statistical tools applied, the PRM of order 4 describes the relationship better than the other methods. The recommendations presented in this study will guide policy makers and help improve Nigeria's economy in terms of its GDP.
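A minimal sketch of the order-4 polynomial fit the abstract singles out, using synthetic stand-in series (the study's actual 20-year MGDP/TGDP figures are not reproduced in the abstract):

```python
import numpy as np

# Illustrative stand-in data: 20 yearly observations of maritime GDP
# and total GDP, generated with a known smooth relationship plus noise.
rng = np.random.default_rng(0)
mgdp = np.linspace(1.0, 5.0, 20)                                  # MGDP
tgdp = 40 + 3.1 * mgdp + 0.2 * mgdp**2 + rng.normal(0, 0.1, 20)   # TGDP

# Polynomial Regression Model (PRM) of order 4: TGDP as a degree-4
# polynomial in MGDP, fitted by least squares.
coeffs = np.polyfit(mgdp, tgdp, deg=4)
fitted = np.polyval(coeffs, mgdp)

# R^2 as a quick goodness-of-fit check, the kind of criterion by which
# one candidate model is judged to "describe the relationship better".
ss_res = np.sum((tgdp - fitted) ** 2)
ss_tot = np.sum((tgdp - tgdp.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
print(f"R^2 of order-4 fit: {r2:.3f}")
```

In practice one would compare this R² (or an information criterion) against the SLR, SVM, GAM, and GLMM fits on the same series before preferring the order-4 model.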

Keywords: maritime transport, economy, GDP, regression, port

Procedia PDF Downloads 135
3950 Remarks on the Lattice Green's Function for the Anisotropic Face Centred Cubic Lattice

Authors: Jihad H. Asad

Abstract:

An expression for the Green's function (GF) of the anisotropic face-centred cubic (FCC) lattice is evaluated analytically and numerically for a single-impurity problem. The density of states (DOS), phase shift, and scattering cross section are expressed in terms of complete elliptic integrals of the first kind.
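Since the closed-form results rest on the complete elliptic integral of the first kind, a short numerical sketch is useful: K(m) can be evaluated to machine precision via the arithmetic-geometric mean, which is how such lattice GF expressions are typically computed in practice (parameter convention K(m) = ∫₀^{π/2} dθ / √(1 − m sin²θ)):

```python
import math

def ellipk(m, tol=1e-15):
    """Complete elliptic integral of the first kind, K(m), computed
    from the arithmetic-geometric mean: K(m) = pi / (2 * agm(1, sqrt(1-m)))."""
    a, b = 1.0, math.sqrt(1.0 - m)
    while abs(a - b) > tol:
        a, b = (a + b) / 2.0, math.sqrt(a * b)
    return math.pi / (2.0 * a)

print(ellipk(0.0))   # → 1.5707963..., i.e. pi/2
print(ellipk(0.5))   # → 1.8540746...
```

The AGM iteration converges quadratically, so a handful of iterations suffices even near the singular endpoint m → 1, where the DOS of such lattice Green's functions typically develops its logarithmic features.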

Keywords: lattice Green's function, elliptic integral, physics, cubic lattice

Procedia PDF Downloads 453
3949 A Statistical Approach to Air Pollution in Mexico City and Its Impacts on Well-Being

Authors: Ana B. Carrera-Aguilar , Rodrigo T. Sepulveda-Hirose, Diego A. Bernal-Gurrusquieta, Francisco A. Ramirez Casas

Abstract:

In recent years, Mexico City has presented high levels of atmospheric pollution; the city is also an example of the inequality and poverty that affect metropolitan areas around the world. This combination of social and economic exclusion, coupled with high levels of pollution, evidences a loss of well-being among the population. The effect of air pollution on quality of life is an area of study that has been overlooked. The purpose of this study is to find relations between air quality and quality of life in Mexico City through statistical analysis, using a regression model and principal component analysis of several atmospheric contaminants (CO, NO₂, ozone, particulate matter, SO₂) and well-being indexes (HDI, poverty, inequality, life expectancy, and a health-care index). The data correspond to official information (INEGI, SEDEMA, and CEPAL) for 2000-2018. Preliminary results show that the Human Development Index (HDI) is affected by pollution, its indicators being reduced in the presence of contaminants. It is necessary to promote strong interest in this issue in Mexico City; otherwise, the problem will not only remain but worsen, affecting those who have the least and the population's well-being in general.

Keywords: air quality, Mexico City, quality of life, statistics

Procedia PDF Downloads 126
3948 Isolation and Classification of Red Blood Cells in Anemic Microscopic Images

Authors: Jameela Ali Alkrimi, Abdul Rahim Ahmad, Azizah Suliman, Loay E. George

Abstract:

Red blood cells (RBCs) are among the most commonly and intensively studied types of blood cells in cell biology. A lack of RBCs is a condition characterized by a lower-than-normal hemoglobin level; this condition is referred to as 'anemia'. In this study, software was developed to isolate RBCs and classify anemic RBCs in microscopic images using a machine learning approach. Several features of RBCs were extracted using image processing algorithms, including principal component analysis (PCA). With the proposed method, RBCs were isolated in 34 seconds from an image containing 18 to 27 cells. We also proposed that PCA could be performed to increase the speed and efficiency of classification. Our classifier algorithms yielded accuracy rates of 100%, 99.99%, and 96.50% for the K-nearest neighbor (K-NN) algorithm, support vector machine (SVM), and artificial neural network (ANN), respectively. Classification was evaluated using sensitivity, specificity, and kappa statistics. In conclusion, the classification results were obtained in a shorter time and more efficiently when PCA was used.
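The PCA-then-K-NN pipeline the abstract proposes can be sketched with NumPy alone. The feature matrix below is a synthetic stand-in for the image-derived RBC features (the paper's real features come from image processing, which is not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in features: 40 "normal" and 40 "anemic" cells,
# 6 features each, with the two classes separated in feature space.
normal = rng.normal(0.0, 1.0, (40, 6))
anemic = rng.normal(2.5, 1.0, (40, 6))
X = np.vstack([normal, anemic])
y = np.array([0] * 40 + [1] * 40)

# PCA via SVD: project onto the top-2 principal components to speed
# up the downstream classifier, as the paper proposes.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:2].T

def knn_predict(train, labels, query, k=3):
    """Plain k-nearest-neighbour majority vote in the reduced space."""
    d = np.linalg.norm(train - query, axis=1)
    nearest = labels[np.argsort(d)[:k]]
    return np.bincount(nearest).argmax()

# Leave-one-out accuracy in the 2-D PCA space
correct = sum(
    knn_predict(np.delete(Z, i, 0), np.delete(y, i), Z[i]) == y[i]
    for i in range(len(y))
)
print(f"LOO accuracy: {correct / len(y):.2f}")
```

Working in the 2-D projected space rather than the full feature space is what buys the speed-up the abstract reports; sensitivity, specificity, and kappa would then be read off the confusion matrix of these predictions.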

Keywords: red blood cells, pre-processing image algorithms, classification algorithms, principal component analysis PCA, confusion matrix, kappa statistical parameters, ROC

Procedia PDF Downloads 387
3947 Tailoring Quantum Oscillations of Excitonic Schrodinger’s Cats as Qubits

Authors: Amit Bhunia, Mohit Kumar Singh, Maryam Al Huwayz, Mohamed Henini, Shouvik Datta

Abstract:

We report [https://arxiv.org/abs/2107.13518] experimental detection and control of a Schrodinger's-cat-like, macroscopically large, quantum coherent state of a two-component Bose-Einstein condensate of spatially indirect electron-hole pairs, or excitons, using a resonant tunneling diode of III-V semiconductors. This provides access to millions of excitons as qubits to allow efficient, fault-tolerant quantum computation. In this work, we measure phase-coherent periodic oscillations in photo-generated capacitance as a function of applied voltage bias and light intensity over a macroscopically large area. The periodic presence and absence of splitting of excitonic peaks in the optical spectra measured by photocapacitance point towards tunneling-induced variations in the capacitive coupling between the quantum well and quantum dots. Observation of negative 'quantum capacitance', due to screening of charge carriers by the quantum well, indicates Coulomb correlations of interacting excitons in the plane of the sample. We also establish that coherent resonant tunneling in this well-dot heterostructure restricts the available momentum space of the charge carriers within the quantum well. Consequently, the electric polarization vectors of the associated indirect excitons collectively orient along the direction of the applied bias, and these excitons undergo Bose-Einstein condensation below ~100 K. The generation of interference beats in the photocapacitance oscillation even with incoherent white light further confirms the presence of stable, long-range spatial correlation among these indirect excitons. We finally demonstrate collective Rabi oscillations of these macroscopically large, 'multipartite', two-level, coupled and uncoupled quantum states of the excitonic condensate as qubits.
Therefore, our study not only brings the physics and technology of Bose-Einstein condensation within the reach of semiconductor chips but also opens up experimental investigations of the fundamentals of quantum physics using similar techniques. The operational temperature of such a two-component excitonic BEC can be raised further with a more densely packed, ordered array of QDs and/or materials having larger excitonic binding energies. However, the fabrication of single crystals of 0D-2D heterostructures using 2D materials (e.g., transition metal dichalcogenides, oxides, perovskites, etc.) with higher excitonic binding energies is still an open challenge for semiconductor optoelectronics. As of now, these 0D-2D heterostructures can already be scaled up for mass production of miniaturized, portable quantum optoelectronic devices using existing III-V and/or nitride-based semiconductor fabrication technologies.

Keywords: exciton, Bose-Einstein condensation, quantum computation, heterostructures, semiconductor Physics, quantum fluids, Schrodinger's Cat

Procedia PDF Downloads 173
3946 Local Spectrum Feature Extraction for Face Recognition

Authors: Muhammad Imran Ahmad, Ruzelita Ngadiran, Mohd Nazrin Md Isa, Nor Ashidi Mat Isa, Mohd ZaizuIlyas, Raja Abdullah Raja Ahmad, Said Amirul Anwar Ab Hamid, Muzammil Jusoh

Abstract:

This paper presents two techniques, local feature extraction using the image spectrum and low-frequency spectrum modelling using a Gaussian mixture model (GMM), to capture the underlying statistical information and improve the performance of a face recognition system. Local spectrum features are extracted using overlapping sub-block windows mapped onto the face image. For each block, the spatial domain is transformed to the frequency domain using the DFT. Low-frequency coefficients are preserved by applying a rectangular mask to the spectrum of the facial image and discarding the high-frequency coefficients. The low-frequency information is non-Gaussian in the feature space, and by combining several Gaussian functions with different statistical properties, the best feature representation can be modelled as a probability density function. Recognition is performed using the maximum likelihood value computed from pre-calculated GMM components. The method is tested on the FERET data sets and achieves a 92% recognition rate.
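The extraction step described above can be sketched as follows: slide an overlapping window over the image, take the 2-D DFT of each block, and keep only a small rectangle of low-frequency magnitude coefficients. The image, block size, step, and mask size below are illustrative assumptions, not the paper's settings:

```python
import numpy as np

rng = np.random.default_rng(2)
face = rng.random((32, 32))   # stand-in for a grayscale face image

def local_spectrum_features(img, block=8, step=4, keep=3):
    """Slide an overlapping block window over the image, take the 2-D
    DFT of each block, and keep a keep x keep rectangle of low-frequency
    magnitude coefficients (the rectangular mask); high frequencies
    are discarded."""
    feats = []
    h, w = img.shape
    for r in range(0, h - block + 1, step):
        for c in range(0, w - block + 1, step):
            spec = np.fft.fft2(img[r:r + block, c:c + block])
            feats.append(np.abs(spec[:keep, :keep]).ravel())
    return np.array(feats)

F = local_spectrum_features(face)
print(F.shape)   # one 9-dim low-frequency vector per block
```

The resulting per-block vectors are what a GMM would then be fitted to, with recognition by the maximum likelihood over each enrolled subject's pre-calculated mixture.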

Keywords: local feature modelling, face recognition system, Gaussian mixture models, FERET

Procedia PDF Downloads 645
3945 Econophysical Approach on Predictability of Financial Crisis: The 2001 Crisis of Turkey and Argentina Case

Authors: Arzu K. Kamberli, Tolga Ulusoy

Abstract:

Technological developments and the resulting global communication have made the 21st century one in which large amounts of capital can be moved from one end of the world to the other at the push of a button. As a result, capital inflows have accelerated, bringing with them crisis contagion. Given irrational human behavior, financial crises have become a basic problem for countries worldwide and have increased researchers' interest in their causes and in the periods in which they occur. The complex, nonlinear nature of financial crises has accordingly been taken up by the new discipline of econophysics. As is known, although mechanisms for predicting financial crises exist, there is no definitive method. In this context, this study uses the concept of the electric field from electrostatics to develop an early econophysical approach to global financial crises. The aim is to define a model that can act before a financial crisis, identify financial fragility at an earlier stage, and help public- and private-sector actors, policy makers, and economists with an econophysical approach. The 2001 Turkey crisis is assessed with data from the Turkish Central Bank covering 1992 to 2007; for the 2001 Argentina crisis, data were taken from the IMF and the Central Bank of Argentina for 1997 to 2007. As an econophysical method, an analogy is drawn between Gauss's law, used in the calculation of the electric field, and the forecasting of a financial crisis. Exploiting this analogy, the concept of Φ (financial flux), based on currency movements and money mobility, is adopted for the pre-warning of a crisis.
The Φ (financial flux) values, calculated by this formula for the first time in this study, were analyzed in MATLAB, and for the 2001 Turkey and Argentina crises the Φ values were confirmed to give a pre-warning.
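The abstract does not reproduce the Φ formula itself, so the following is only a toy illustration of the Gauss's-law analogy: in electrostatics the flux is Φ = Σ Eᵢ·Aᵢ over surface elements, which here is read, purely hypothetically, as (net capital flow rate) × (channel exposure) summed over capital channels:

```python
def financial_flux(flows, exposures):
    """Discrete analogue of Gauss's-law flux, Phi = sum(E_i * A_i),
    with flows playing the role of the field and exposures the role
    of the surface elements. Illustrative only; the paper's actual
    Phi formula is not given in the abstract."""
    return sum(f * a for f, a in zip(flows, exposures))

# Hypothetical monthly net flows and channel weights
flows = [1.2, -0.4, 2.5, -3.1]     # e.g. portfolio, FDI, trade, debt
exposures = [0.3, 0.2, 0.1, 0.4]
phi = financial_flux(flows, exposures)
print(phi)   # a strongly negative Phi would be read as a warning signal
```

In the study's framing, a time series of such Φ values would be monitored, with anomalous values ahead of 2001 serving as the pre-warning for the Turkish and Argentine crises.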

Keywords: econophysics, financial crisis, Gauss's Law, physics

Procedia PDF Downloads 143
3944 Analysis of Organizational Factors Effect on Performing Electronic Commerce Strategy: A Case Study of the Namakin Food Industry

Authors: Seyed Hamidreza Hejazi Dehghani, Neda Khounsari

Abstract:

The quick growth of electronic commerce in developed countries means that developing nations must fundamentally change their commerce strategies. Most organizations are aware of the impact of the Internet and e-commerce on the future of their firm, and thus they have to focus on the organizational factors that affect the deployment of an e-commerce strategy. In this situation, it is essential to identify organizational factors such as organizational culture, human resources, size, structure, and product/service that impact an e-commerce strategy. Accordingly, this research specifies the effects of organizational factors on applying an e-commerce strategy in the Namakin food industry. The statistical population of this research is 95 managers and employees. Cochran's formula is used to determine the sample size, which is 77 of the statistical population. SPSS and SmartPLS software were utilized for analyzing the collected data. The results of hypothesis testing show that organizational factors have positive and significant effects on applying an e-commerce strategy. On the other hand, among the sub-hypotheses, those concerning organizational culture and size were rejected, while the others were accepted.
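The sample-size step can be checked directly. Assuming the conventional parameter choices that the abstract does not state (z = 1.96 for 95% confidence, p = 0.5, margin of error e = 0.05), Cochran's formula with the finite-population correction reproduces the paper's figure of 77 from a population of 95:

```python
import math

def cochran_sample_size(N, z=1.96, p=0.5, e=0.05):
    """Cochran's sample-size formula with finite-population correction.
    z, p, e defaults are conventional assumptions, not stated in the paper."""
    n0 = (z ** 2) * p * (1 - p) / e ** 2    # infinite-population size
    n = n0 / (1 + (n0 - 1) / N)             # correct for population N
    return math.ceil(n)

print(cochran_sample_size(95))   # → 77, matching the paper's sample
```

Note how strongly the correction bites for small populations: the uncorrected n₀ of about 384 collapses to 77 once N = 95 is taken into account.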

Keywords: electronic commerce, organizational factors, attitude of managers, organizational readiness

Procedia PDF Downloads 261
3943 AI Peer Review Challenge: Standard Model of Physics vs 4D GEM EOS

Authors: David A. Harness

Abstract:

The natural evolution of automated theorem proving (ATP) cognitive systems is to meet AI peer-review standards. The ATP process of axiom selection from Mizar to prove a conjecture would be further refined, as in all human and machine learning, by solving the real-world problem of the proposed AI peer-review challenge: determine which conjecture forms the higher-confidence-level constructive proof, the Standard Model of Physics SU(n) lattice gauge group operation or the present non-standard 4D GEM EOS SU(n) lattice gauge group spatially extended operation, in which the photon and electron are the first two trace angular momentum invariants of a gravitoelectromagnetic (GEM) energy-momentum density tensor wavetrain integration spin-stress pressure-volume equation of state (EOS), initiated via 32 lines of Mathematica code. The resulting gravitoelectromagnetic spectrum ranges from compressive through rarefactive of the central cosmological-constant vacuum energy density, in units of pascals. Said self-adjoint group operation operates exclusively on the stress-energy-momentum tensor of the Einstein field equations, introducing quantization directly at the 4D spacetime level, essentially reformulating the Yang-Mills virtual superpositioned-particle compounded lattice gauge group quantization of the vacuum into a single hyper-complex multi-valued GEM U(1) × SU(1,3) lattice gauge group Planck spacetime mesh quantization of the vacuum. Thus the Mizar corpus already contains all of the axioms required for relevant DeepMath premise selection and unambiguous formal natural-language parsing in context deep learning.

Keywords: automated theorem proving, constructive quantum field theory, information theory, neural networks

Procedia PDF Downloads 158
3942 Data and Spatial Analysis for Economy and Education of 28 E.U. Member-States for 2014

Authors: Alexiou Dimitra, Fragkaki Maria

Abstract:

The objective of the paper is the study of geographic, economic, and educational variables and their contribution to determining the position of each member state among the EU-28 countries, based on the values of seven variables as given by Eurostat. The data analysis methods of Multiple Factorial Correspondence Analysis (MFCA), Principal Component Analysis, and Factor Analysis have been used. The cross-tabulation tables consist of the values of the seven variables for the 28 countries in 2014. The data are processed using the CHIC Analysis V 1.1 software package, and the results of MFCA and Ascending Hierarchical Classification are given in numerical and graphical form. For comparison, the Factor procedure of the IBM SPSS 20 statistical package was applied to the same data. The numerical and graphical results, presented in tables and graphs, demonstrate the agreement between the two methods. The most important result is the study of the relations among the 28 countries and the position of each country in the groups or clouds formed according to the values of the corresponding variables.
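The Ascending Hierarchical Classification step, which groups countries into "clouds", is an agglomerative clustering of the country-by-variable table. A minimal sketch with SciPy, on a synthetic stand-in table (the real 28 × 7 Eurostat values are not reproduced here):

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(3)

# Stand-in for the Eurostat table: 10 "countries" x 7 standardised
# indicators, drawn as two well-separated synthetic groups.
group_a = rng.normal(0.0, 0.5, (5, 7))
group_b = rng.normal(3.0, 0.5, (5, 7))
X = np.vstack([group_a, group_b])

# Ascending (agglomerative) hierarchical classification with Ward
# linkage, then cut the dendrogram into two clusters ("clouds").
Z = linkage(X, method="ward")
labels = fcluster(Z, t=2, criterion="maxclust")
print(labels)   # each country assigned to one of the two clouds
```

In the paper's workflow the cluster memberships would then be compared against the MFCA factor-plane positions to check that the two views of the 28 countries agree.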

Keywords: Multiple Factorial Correspondence Analysis, Principal Component Analysis, Factor Analysis, E.U.-28 countries, Statistical package IBM SPSS 20, CHIC Analysis V 1.1 Software, Eurostat.eu Statistics

Procedia PDF Downloads 498