Search results for: probability and statistics
2680 Application of Logistic Regression Model to Ascertain the Determinants of Food Security among Households in Maiduguri Metropolis, Borno State, Nigeria
Authors: Abdullahi Yahaya Musa, Harun Rann Bakari
Abstract:
The study examined the determinants of food security among households in Maiduguri Metropolis, Borno State, Nigeria. The objectives of the study were to examine the determinants of food security among households and to identify the coping strategies employed by food-insecure households in the metropolis. From a study population of 843,964 people, 400 respondents were sampled. The study used a self-developed questionnaire to collect data; all four hundred (400) administered copies were retrieved, giving a 100% return rate. The study employed descriptive and inferential statistics for data analysis: descriptive statistics (frequency counts and percentages) were used to analyze the socio-economic characteristics of the respondents and the coping strategies, while inferential statistics (logit regression analysis) was used to analyze the determinants of food security. The results were presented in tables and discussed according to the research objectives. The study revealed that HHA, HHE, HHSZ, HHSX, HHAS, HHI, HHFS, HHFE, HHAC and HHCDR were the determinants of food security in Maiduguri Metropolis. Relying on less preferred foods, purchasing food on credit, limiting food intake to ensure children get enough, borrowing money to buy foodstuffs, relying on help from relatives or friends outside the household, adult family members skipping or reducing a meal because of insufficient finances, and rationing money to household members to buy street food were the coping strategies employed by food-insecure households in the metropolis. The study recommended that the Nigerian Government intensify the fight against the Boko Haram insurgency; ending the insurgency would enable farmers to return to farming in Borno State.
Keywords: internally displaced persons, food security, coping strategies, descriptive statistics, logistic regression model, odds ratio
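A minimal sketch of the kind of logit fit the study describes, using Python and statsmodels; the data are simulated stand-ins, and the covariate names merely echo a few of the study's determinant labels (HHA, HHE, HHSZ, HHI):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical data standing in for the survey; columns mirror a few of the
# study's determinants (household head age, education, household size, income).
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "HHA": rng.integers(20, 70, 400),    # household head age
    "HHE": rng.integers(0, 16, 400),     # years of education
    "HHSZ": rng.integers(1, 12, 400),    # household size
    "HHI": rng.normal(50, 15, 400),      # income (illustrative units)
})
# Synthetic binary outcome: 1 = food secure, 0 = food insecure
eta = -2 + 0.03 * df["HHA"] + 0.1 * df["HHE"] - 0.2 * df["HHSZ"] + 0.02 * df["HHI"]
df["food_secure"] = rng.binomial(1, 1 / (1 + np.exp(-eta)))

X = sm.add_constant(df[["HHA", "HHE", "HHSZ", "HHI"]])
model = sm.Logit(df["food_secure"], X).fit(disp=0)
print(model.summary())
print(np.exp(model.params))  # odds ratios, the effect measure the study reports
```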
Procedia PDF Downloads 145
2679 Gender Dimension of Migrations Influenced by Genocide and Feminicides around the Globe
Authors: Lejla Mušić
Abstract:
Gender dimension of migration analyzes the intersection between world statistics on male and female migrations, including questions of youth migrations. Comparative analysis of world migration statistics, as the methodology, offers insight into the position of women in labor markets around the world. Youth face different forms of hardship in the contemporary world: the main problems are illegal migration, the feminization of poverty, the kidnapping of girls in Nigeria, and the femicides in Juárez, Mexico. Illegal migrations involve forced labor, rape and prostitution. Transgender youth share ideas through online media (anti-bullying videos) and develop their own styles such as anarcho-punk, rave, or rock. Therefore, stronger gender equality laws and laws protecting women at work should be enforced.
Keywords: hyperfeminisation, rape, gangs of girls, rent boys masculinities, Varoç in Istanbul, forced labor, rape and prostitution, illegal emigrations
Procedia PDF Downloads 255
2678 Investigation of the Main Trends of Tourist Expenses in Georgia
Authors: Nino Abesadze, Marine Mindorashvili, Nino Paresashvili
Abstract:
The main purpose of the article is to carry out a complex statistical analysis of the tourist expenses of foreign visitors. We used a mixed selection technique that combines rules of random and proportional selection, and the SPSS software package was used to compute the statistical data for the corresponding analysis. The methodology of tourism statistics was implemented according to international standards. Important information was collected and grouped from the major Georgian airports, techniques of statistical observation were prepared, and a representative population of foreign visitors and a rule for selecting respondents were determined. Tourist numbers show a growing trend, and the share of tourists from post-Soviet countries constantly increases. The level of satisfaction with tourist facilities and quality of service has grown, but a disparity between quality of service and prices remains a problem. The structure of foreign visitors' tourist expenses is diverse, and the competitiveness of Georgian tourist companies' products is high.
Keywords: tourist, expenses, methods, statistics, analysis
Procedia PDF Downloads 336
2677 Confidence Envelopes for Parametric Model Selection Inference and Post-Model Selection Inference
Authors: I. M. L. Nadeesha Jayaweera, Adao Alex Trindade
Abstract:
In choosing a candidate model in likelihood-based modeling via an information criterion, the practitioner is often faced with the difficult task of deciding just how far up the ranked list to look. Motivated by this pragmatic necessity, we construct an uncertainty band for a generalized (model selection) information criterion (GIC), defined as a criterion whose limit in probability is identical to that of the normalized log-likelihood. This includes common special cases such as AIC and BIC. The method starts from the asymptotic normality of the GIC for the joint distribution of the candidate models in an independent and identically distributed (IID) data framework and proceeds by deriving the (asymptotically) exact distribution of the minimum. The calculation of an upper quantile for its distribution then involves the computation of multivariate Gaussian integrals, which is amenable to efficient implementation via the R package "mvtnorm". The performance of the methodology is tested on simulated data by checking the coverage probability of nominal upper quantiles and compared to the bootstrap. Both methods give coverages close to nominal for large samples, but the bootstrap is two orders of magnitude slower. The methodology is subsequently extended to two other commonly used model structures: regression and time series. In the regression case, we derive the corresponding asymptotically exact distribution of the minimum GIC by invoking Lindeberg-Feller type conditions for triangular arrays, and are thus similarly able to calculate upper quantiles for its distribution via multivariate Gaussian integration. The bootstrap once again provides a default competing procedure, and we find that similar comparative performance metrics hold as in the IID case. The time series case is complicated by a far more intricate asymptotic regime for the joint distribution of the model GIC statistics. Under a Gaussian likelihood, the default in most packages, one needs to derive the limiting distribution of a normalized quadratic form for a realization from a stationary series. Under conditions on the process satisfied by ARMA models, a multivariate normal limit is once again achieved. The bootstrap can, however, be employed for its computation, whence we are once again in the multivariate Gaussian integration paradigm for upper quantile evaluation. Comparisons of this bootstrap-aided semi-exact method with the full-blown bootstrap once again reveal similar performance but faster computation speeds. One of the most difficult problems in contemporary statistical methodological research is accounting for the extra variability introduced by model selection uncertainty, the so-called post-model selection inference (PMSI) problem. We explore ways in which the GIC uncertainty band can be inverted to make inferences on the parameters. In the IID case this is attempted by pivoting the CDF of the asymptotically exact distribution of the minimum GIC. For inference on one parameter at a time and a small number of candidate models, this works well, and the attained PMSI confidence intervals are wider than the MLE-based Wald intervals, as expected.
Keywords: model selection inference, generalized information criteria, post-model selection, asymptotic theory
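A minimal sketch of the upper-quantile computation for the minimum of jointly Gaussian GIC values, using scipy in place of the R package "mvtnorm"; the mean vector and covariance matrix are illustrative placeholders for what the asymptotic theory would supply:

```python
import numpy as np
from scipy.stats import multivariate_normal
from scipy.optimize import brentq

# Hypothetical asymptotic mean and covariance of the GIC values for three
# candidate models (in the paper these come from the asymptotic theory).
mu = np.array([10.0, 10.5, 12.0])
Sigma = np.array([[4.0, 2.0, 1.0],
                  [2.0, 4.0, 1.5],
                  [1.0, 1.5, 4.0]])

def cdf_min(t):
    # P(min_i GIC_i <= t) = 1 - P(all GIC_i > t); the joint survival
    # probability is a multivariate Gaussian integral (cf. mvtnorm in R).
    surv = multivariate_normal(mean=-mu, cov=Sigma).cdf(np.full(3, -t))
    return 1.0 - surv

# Upper 95% quantile of the minimum GIC, found by root-finding.
q95 = brentq(lambda t: cdf_min(t) - 0.95, mu.min() - 10, mu.max() + 10)
print(f"95% upper quantile of min GIC: {q95:.3f}")
```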
Procedia PDF Downloads 86
2676 Assessing and Identifying Factors Affecting Customer Satisfaction with the Commercial Bank of Ethiopia: The Case of West Shoa Zone (Bako, Gedo, Ambo, Ginchi and Holeta), Ethiopia
Authors: Habte Tadesse Likassa, Bacha Edosa
Abstract:
Customer satisfaction is essential for banks to be productive and successful in any organization and business area. The main goal of the study is to assess and identify the factors that influence customer satisfaction in the West Shoa Zone branches of the Commercial Bank of Ethiopia (Holeta, Ginchi, Ambo, Gedo and Bako). A stratified random sampling procedure was used, and 520 customers were drawn from the target population by simple random sampling (the lottery method). The sample size for each bank branch was allocated using probability-proportional-to-size techniques. Both descriptive and inferential statistical methods were used, and a binary logistic regression model was fitted to assess the significance of factors affecting customer satisfaction; the SPSS statistical package was used for data analysis. The results reveal that the overall level of customer satisfaction in the study area is low (38.85%), compared with 61.15% who were not satisfied. Almost all factors included in the study were significantly associated with customer satisfaction. Comparing branches by odds ratios, customers of the Ambo and Bako branches were less satisfied than customers of the Holeta branch, while customers of the Ginchi and Gedo branches were more satisfied than those of Holeta. Since the level of customer satisfaction in the study area was low, it is recommended that the concerned bodies work cooperatively to maximize the satisfaction of their customers.
Keywords: customers, satisfaction, binary logistic, complaint handling process, waiting time
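A minimal sketch of the probability-proportional-to-size allocation step, assuming hypothetical branch customer counts (the study's actual figures are not reproduced here):

```python
# Allocate the 520 sampled customers across branches in proportion to each
# branch's customer population; the population figures are placeholders.
branch_customers = {"Holeta": 5200, "Ginchi": 3100, "Ambo": 7400,
                    "Gedo": 2600, "Bako": 4100}
total = sum(branch_customers.values())
n = 520
# Simple rounding; in practice a largest-remainder step keeps the sum exact.
allocation = {b: round(n * c / total) for b, c in branch_customers.items()}
print(allocation, "total:", sum(allocation.values()))
```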
Procedia PDF Downloads 463
2675 Case-Based Reasoning for Modelling Random Variables in the Reliability Assessment of Existing Structures
Authors: Francesca Marsili
Abstract:
The reliability assessment of existing structures with probabilistic methods is becoming an increasingly important and frequent engineering task. However, probabilistic reliability methods require exhaustive knowledge of the stochastic models of the variables involved in the assessment; at the moment, standards for the modeling of these variables are absent, which is an obstacle to the dissemination of probabilistic methods. The framework within which probability distribution functions (PDFs) are established is Bayesian statistics, which uses Bayes' theorem: a prior PDF for the considered parameter is established based on information from the design stage and on qualitative judgments drawn from the engineer's past experience; the prior model is then updated with the results of investigations carried out on the considered structure, such as material testing and the determination of action and structural properties. The application of Bayesian statistics raises two kinds of problems: 1. the results of the updating depend on the engineer's previous experience; 2. the prior PDF can be updated only if the structure has been tested and quantitative data suitable for statistical treatment have been collected; performing tests is always an expensive and time-consuming operation, and if the considered structure is an ancient building, destructive tests could compromise its cultural value and should therefore be avoided. In order to solve these problems, an interesting research path is to investigate Artificial Intelligence (AI) techniques that can automate the modeling of variables and update material parameters without destructive tests. Among these, one that deserves particular attention for the purposes of this study is Case-Based Reasoning (CBR). In this application, cases are existing buildings on which material tests have already been carried out and for which updated PDFs of the material mechanical parameters have been computed through a Bayesian analysis. Each case is then composed of a qualitative description of the material under assessment and the posterior PDFs that describe its material properties. The problem to be solved is the definition of PDFs for the material parameters involved in the reliability assessment of the considered structure. A CBR system is a good candidate for automating the modeling of variables because: 1. engineers already estimate material properties based on experience collected during the assessment of similar structures, or on similar cases collected in the literature or in databases; 2. material tests carried out on structures can easily be collected from laboratory databases or from the literature; 3. the system will provide the user with a reliable probabilistic description of the variables involved in the assessment, which will also serve as a tool in support of the engineer's qualitative judgments. Automated modeling of variables can help spread probabilistic reliability assessment of existing buildings in common engineering practice and help target the best interventions and further tests on the structure; CBR is a technique which may help to achieve this.
Keywords: reliability assessment of existing buildings, Bayesian analysis, case-based reasoning, historical structures
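A minimal sketch of the Bayesian updating step stored with each case, assuming a normal prior (from design documents or similar cases retrieved by the CBR system) and a known testing standard deviation; all numbers are illustrative:

```python
import numpy as np

# Conjugate normal-normal update of a material parameter, e.g. concrete
# compressive strength in MPa; prior and test values are hypothetical.
mu0, sigma0 = 30.0, 5.0                      # prior mean and std
tests = np.array([27.5, 29.0, 31.2, 28.4])   # core test results
sigma = 3.0                                  # assumed known testing std

n = len(tests)
post_var = 1.0 / (1.0 / sigma0**2 + n / sigma**2)
post_mean = post_var * (mu0 / sigma0**2 + tests.sum() / sigma**2)
print(f"posterior: N({post_mean:.2f}, {np.sqrt(post_var):.2f}^2)")
```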
Procedia PDF Downloads 336
2674 The 10-year Risk of Major Osteoporotic and Hip Fractures Among Indonesian People Living with HIV
Authors: Iqbal Pramukti, Mamat Lukman, Hasniatisari Harun, Kusman Ibrahim
Abstract:
Introduction: People living with HIV have a higher risk of osteoporotic fracture than the general population. The purpose of this study was to predict the 10-year risk of fracture among people living with HIV (PLWH) using FRAX™ and to identify characteristics related to fracture risk. Methodology: The study consisted of 75 subjects. The ten-year probabilities of major osteoporotic fractures (MOF) and hip fractures were assessed using the FRAX™ algorithm, and cross-tabulation was used to identify participant characteristics related to fracture risk. Results: The overall mean 10-year probability of fracture was 2.4% (1.7) for MOF and 0.4% (0.3) for hip fractures. Participants with a parental history of hip fracture, smoking behavior or glucocorticoid use showed higher MOF scores than those without (3.1 vs. 2.5; 4.6 vs. 2.5; and 3.4 vs. 2.5, respectively). For the hip fracture score, participants with a parental history of hip fracture, smoking behavior or glucocorticoid use likewise showed higher scores than those without (0.5 vs. 0.3; 0.8 vs. 0.3; and 0.5 vs. 0.3, respectively). Conclusions: The 10-year risk of fracture was higher among PLWH with several factors, including parental hip fracture history, smoking behavior and glucocorticoid use. Further analysis of the determining factors using multivariate regression with a larger sample size is required to confirm the factors associated with high fracture risk.
Keywords: HIV, PLWH, osteoporotic fractures, hip fractures, 10-year risk of fracture, FRAX
Procedia PDF Downloads 48
2673 Co-Integration Model for Predicting Inflation Movement in Nigeria
Authors: Salako Rotimi, Oshungade Stephen, Ojewoye Opeyemi
Abstract:
The maintenance of price stability is one of the macroeconomic challenges facing Nigeria as a nation. This paper attempts to build a cointegrated multivariate time series model for inflation movement in Nigeria using data extracted from the abstract of statistics of the Central Bank of Nigeria (CBN) from 2008 to 2017. The Johansen cointegration test suggests at least one cointegrating vector describing the long-run relationship between the Consumer Price Index (CPI), Food Price Index (FPI) and Non-Food Price Index (NFPI). All three series show an increasing pattern, indicating non-stationarity in each of the series. Furthermore, model predictability was established with the root mean square error, mean absolute error, mean absolute percentage error, and Theil's U statistic for n-step forecasting. The results show that the long-run coefficient of the consumer price index (CPI) implies a positive long-run relationship with the food price index (FPI) and the non-food price index (NFPI).
Keywords: economic, inflation, model, series
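A minimal sketch of a Johansen cointegration test on the three indices, assuming the statsmodels implementation; the monthly series are synthetic stand-ins for the CBN data:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import coint_johansen

# Synthetic monthly index series sharing one stochastic trend, standing in
# for CPI, FPI and NFPI over 2008-2017 (120 months).
rng = np.random.default_rng(1)
n = 120
trend = np.cumsum(rng.normal(0.5, 1.0, n))
df = pd.DataFrame({
    "CPI": trend + rng.normal(0, 0.5, n),
    "FPI": 1.2 * trend + rng.normal(0, 0.5, n),
    "NFPI": 0.8 * trend + rng.normal(0, 0.5, n),
})

result = coint_johansen(df, det_order=0, k_ar_diff=1)
print("trace statistics:", result.lr1)
print("95% critical values:", result.cvt[:, 1])  # columns: 90%, 95%, 99%
```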
Procedia PDF Downloads 242
2672 Factors Driving Consumer Intention in Online Shopping
Authors: Wanida Suwunniponth
Abstract:
The objectives of this research paper were to study the factors influencing consumers' willingness to purchase products online, including quality of website, perceived ease of use, perceived usefulness, trust in online purchases, attitude towards online shopping, and intention to purchase online. The research combined quantitative and qualitative methods, utilizing both a questionnaire and in-depth interviews. The questionnaire was used to collect data from 350 consumers with online shopping experience in Bangkok, Thailand. Statistics utilized in this research included descriptive statistics and path analysis. The findings revealed that quality of website, perceived ease of use and perceived usefulness influenced trust in online shopping. Trust in turn influenced attitude towards online purchase, while trust and attitude towards online purchase together shaped the intention to purchase online.
Keywords: e-commerce, intention, online shopping, TAM, technological acceptance model
Procedia PDF Downloads 257
2671 Influence of Local Soil Conditions on Optimal Load Factors for Seismic Design of Buildings
Authors: Miguel A. Orellana, Sonia E. Ruiz, Juan Bojórquez
Abstract:
Optimal load factors (dead, live and seismic) used for the design of buildings may differ depending on the characteristics of the seismic ground motions to which the buildings are subjected, which are closely related to the type of soil at the site. The influence of the type of soil on those load factors is analyzed in the present study. A methodology is employed for establishing optimal load factors that minimize the cost over the life cycle of the structure, with the restriction that the probability of structural failure must be less than or equal to a prescribed value. The life-cycle cost model used here includes different types of costs. The optimization methodology is applied to two groups of reinforced concrete buildings. One set (consisting of 4-, 7-, and 10-story buildings) is located on firm ground (with a dominant period Ts = 0.5 s) and the other (consisting of 6-, 12-, and 16-story buildings) on the soft soil (Ts = 1.5 s) of Mexico City. Each group of buildings is designed using different combinations of load factors. The statistics of the maximum inter-story drifts (associated with the structural capacity) are found by means of incremental dynamic analyses. The buildings located in the firm zone are analyzed under the action of 10 strong seismic records, and those in the soft zone under 13 strong ground motions; all the motions correspond to seismic subduction events with magnitude M = 6.9. The structural damage and the expected total costs corresponding to each group of buildings are then estimated. It is concluded that the optimal load factor combination for the design of buildings located on firm ground differs from that for buildings located on soft soil.
Keywords: life-cycle cost, optimal load factors, reinforced concrete buildings, total costs, type of soil
Procedia PDF Downloads 304
2670 Perspectives of Renewable Energy in 21st Century in India: Statistics and Estimation
Authors: Manoj Kumar, Rajesh Kumar
Abstract:
With its favourable geographical conditions, the Indian subcontinent is well suited for flourishing renewable energy. Increasing dependence on coal and other conventional sources is driving the world into pollution and depletion of resources. This paper presents statistics on energy consumption and energy generation in the Indian subcontinent, which show energy demand increasingly surpassing energy generation. With the growth in demand for energy, the usage of coal has increased, since the major portion of energy production in India comes from thermal power plants. The increased use of thermal power plants causes pollution and depletion of reserves; hence, a paradigm shift to renewable sources is inevitable. In this work, the capacity and potential of renewable sources in India are analyzed, and based on this analysis, the future potential of these sources is estimated.
Keywords: depletion of reserves, energy consumption and generation, emissions, global warming, renewable sources
Procedia PDF Downloads 429
2669 Computational Methods in Official Statistics with an Example on Calculating and Predicting Diabetes Mellitus [DM] Prevalence in Different Age Groups within Australia in Future Years, in Light of the Aging Population
Authors: D. Hilton
Abstract:
An analysis of the Australian Diabetes Screening Study estimated undiagnosed diabetes mellitus [DM] prevalence in a high-risk, general-practice-based cohort. DM prevalence varied from 9.4% to 18.1% depending upon the diagnostic criteria utilised, with age being a highly significant risk factor. Utilising the gold-standard oral glucose tolerance test, the prevalence of DM was 22-23% in those aged >= 70 years and <15% in those aged 40-59 years. Opportunistic screening in Australian general practice can potentially identify many persons with undiagnosed type 2 DM. An Australian Bureau of Statistics document published three years ago reported the highest rate of DM in men aged 65-74 years [19%], whereas the rate for women was highest in those over 75 years [13%]. Considering that the Australian Bureau of Statistics reported in 2007 that 13% of the population was over 65 years of age, and that this will increase to 23-25% by 2056 with a further projected increase to 25-28% by 2101, this information clearly has to be factored into the equation when age-related diabetes prevalence predictions are calculated. This 10-15% proportional increase of elderly persons within the population demographics has dramatic implications for the estimated number of elderly persons with DM in these age groupings. Computations based on the age-related demographic changes reported in these official statistical documents will be presented, showing estimates for 2056 and 2101 for different age groups. This is relevant to future diabetes prevalence rates and shows that, along with many countries worldwide, Australia is facing an increasing pandemic. In contrast, Japan is expected to see a decrease in the number of persons with diabetes over the next twenty years.
Keywords: epidemiological methods, aging, prevalence, diabetes mellitus
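A minimal sketch of the projection arithmetic the paper describes: expected cases = age-specific prevalence x projected population in the age group. The population and prevalence inputs below are illustrative assumptions, not ABS figures:

```python
# Projected persons 65+ with DM = total population x share aged 65+ x
# age-specific DM prevalence; all three inputs are hypothetical.
population_2056 = 42_000_000      # assumed total population in 2056
share_over_65 = 0.24              # mid-point of the 23-25% projection
prevalence_over_65 = 0.16         # assumed DM prevalence for ages 65+

expected_dm_over_65 = population_2056 * share_over_65 * prevalence_over_65
print(f"projected persons 65+ with DM in 2056: {expected_dm_over_65:,.0f}")
```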
Procedia PDF Downloads 373
2668 A Data-Driven Monitoring Technique Using Combined Anomaly Detectors
Authors: Fouzi Harrou, Ying Sun, Sofiane Khadraoui
Abstract:
Anomaly detection based on Principal Component Analysis (PCA) has been studied intensively and widely applied to multivariate processes with highly cross-correlated process variables. Monitoring metrics such as Hotelling's T2 and the Q statistic are usually used in PCA-based monitoring to elucidate pattern variations in the principal and residual subspaces, respectively. However, these metrics are ill suited to detecting small faults. In this paper, Exponentially Weighted Moving Average (EWMA) charts based on the T2 and Q statistics, T2-EWMA and Q-EWMA, were developed for detecting faults in the process mean. The performance of the proposed methods was compared with that of the conventional PCA-based fault detection method using synthetic data. The results clearly show the benefit and effectiveness of the proposed methods over the conventional PCA method, especially for detecting small faults in highly correlated multivariate data.
Keywords: data-driven method, process control, anomaly detection, dimensionality reduction
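A minimal sketch of the T2-EWMA/Q-EWMA idea on synthetic correlated data; the number of retained components, the smoothing weight and the fault size are illustrative choices:

```python
import numpy as np

# Three correlated variables with a small mean shift after sample 300.
rng = np.random.default_rng(2)
X = rng.multivariate_normal([0, 0, 0],
                            [[1, .8, .6], [.8, 1, .7], [.6, .7, 1]], 500)
X[300:, 0] += 0.5                      # small fault in the process mean

mean0 = X[:300].mean(axis=0)           # in-control reference
Xc = X - mean0
U, s, Vt = np.linalg.svd(X[:300] - mean0, full_matrices=False)
P = Vt[:1].T                           # retain one principal component
eig = (s**2) / (300 - 1)               # eigenvalues of the PCA model

scores = Xc @ P
T2 = (scores**2 / eig[0]).ravel()      # Hotelling's T2 in the PC subspace
resid = Xc - scores @ P.T
Q = (resid**2).sum(axis=1)             # Q (squared prediction error)

def ewma(x, lam=0.2):
    # Exponentially weighted smoothing of a monitoring statistic.
    z = np.zeros_like(x, dtype=float)
    for i in range(len(x)):
        z[i] = lam * x[i] + (1 - lam) * (z[i - 1] if i else x[0])
    return z

print("Q-EWMA faulty-region mean:", ewma(Q)[300:].mean(),
      "vs in-control mean:", ewma(Q)[:300].mean())
```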
Procedia PDF Downloads 298
2667 The Effectiveness of ICT-Assisted PBL on College-Level Nano Knowledge and Learning Skills
Authors: Ya-Ting Carolyn Yang, Ping-Han Cheng, Shi-Hui Gilbert Chang, Terry Yuan-Fang Chen, Chih-Chieh Li
Abstract:
Nanotechnology is widely applied in various areas, so professionals in the related fields have to know more than nano knowledge alone. In this study, we focus on adopting ICT-assisted PBL in college general education to foster professionals who possess multiple abilities. The research adopted a pretest and posttest quasi-experimental design. The control group received traditional instruction, and the experimental group received ICT-assisted PBL instruction. Descriptive statistics will be used to report the means, standard deviations, and adjusted means for the tests in the two groups. Next, analysis of covariance (ANCOVA) will be used to compare the final results of the two research groups after 6 weeks of instruction. The statistics gathered at the end of the research can then be used to make contrasts, showing how different teaching strategies can improve students' understanding of nanotechnology and their learning skills.
Keywords: nanotechnology, science education, project-based learning, information and communication technology
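A minimal sketch of the planned ANCOVA, with simulated pretest/posttest scores, the pretest as covariate and group as factor:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Simulated scores: the ICT-PBL group is given a larger assumed gain.
rng = np.random.default_rng(3)
n = 60
pre = rng.normal(50, 10, 2 * n)
group = np.repeat(["control", "ICT_PBL"], n)
post = pre + np.where(group == "ICT_PBL", 8, 3) + rng.normal(0, 5, 2 * n)
df = pd.DataFrame({"pre": pre, "group": group, "post": post})

# ANCOVA: posttest ~ pretest covariate + group factor.
model = ols("post ~ pre + C(group)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```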
Procedia PDF Downloads 374
2666 Failure Probability Assessment of Concrete Spherical Domes Subjected to Ventilation Controlled Fires Using BIM Tools
Authors: A. T. Kassem
Abstract:
Fires are considered a common hazard that any building may face. Most buildings' structural elements are designed using deterministic approaches that take fire-safety precautions into consideration. Public and highly important buildings are commonly designed considering a standard fire rating and, in many cases, contain large compartments with central domes. Real fire scenarios are not commonly brought into the structural design of buildings because of complexities in both the scenarios and the analysis tools. This paper presents a modern approach to the analysis of spherical domes under real fire conditions via the implementation of building information modelling and the adoption of a probabilistic approach. BIM has been implemented to bridge the gap between various software packages, enabling them to function interactively to model both the real fire and the corresponding structural response. Ventilation-controlled fire scenarios have been modeled using both Revit and PyroSim. Monte Carlo simulation has been adopted to carry out the probabilistic analysis over the various parameters. Conclusions regarding failure probability and fire endurance, in addition to the effects of the various parameters, have been extracted.
Keywords: concrete, spherical domes, ventilation-controlled fires, BIM, Monte Carlo simulation, PyroSim, Revit
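A minimal Monte Carlo sketch of a failure-probability estimate of the kind described: sample a random structural capacity and a random fire-induced demand, and count limit-state violations. The lognormal distributions and parameters are illustrative assumptions, not values from the paper:

```python
import numpy as np

# Crude Monte Carlo estimate of P(capacity < demand) under fire loading.
rng = np.random.default_rng(4)
n = 100_000
capacity = rng.lognormal(mean=np.log(500), sigma=0.15, size=n)      # kNm
fire_demand = rng.lognormal(mean=np.log(350), sigma=0.25, size=n)   # kNm

p_fail = np.mean(capacity < fire_demand)
print(f"estimated failure probability: {p_fail:.4f}")
```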
Procedia PDF Downloads 93
2665 Impact of Violence against Women on Small and Medium Enterprises (SMEs) in Rural Sindh: A Case Study of Kandhkot
Authors: Mohammad Shoaib Khan, Abdul Sattar Bahalkani
Abstract:
This research investigates violence against women and its impact on SMEs in Sindh. The main objective of the current research is to examine women's empowerment through women's participation in small and medium enterprises in upper Sindh. Data were collected from 500 respondents in Kandhkot District using a simple random sampling technique, with a structured questionnaire designed as the instrument for measuring the impact of SME business on women's empowerment in rural Sindh. It was revealed that rural women are less confident and that their husbands give them a hard time once they expose themselves outside the boundaries of the house. It was also revealed that rural women make a major contribution to social, economic, and political development, but that women receive low wages and, due to the non-availability of market facilities, are paid less. The negative impacts of husbands' income and of having children aged 0-6 years are also significant. High income of other household members raises mothers' reservation wage, thus lowering the probability of participation when the objective of working is to help with the family's financial needs. The impact of childcare on mothers' labor force participation is significant, but not as the theory predicted. The probability of labor force participation is significantly higher for women living in urban areas, where job opportunities are greater than in rural areas.
Keywords: empowerment, violence against women, SMEs, rural
Procedia PDF Downloads 330
2664 Merging Appeal to Ignorance, Composition, and Division Argument Schemes with Bayesian Networks
Authors: Kong Ngai Pei
Abstract:
The argument scheme approach to argumentation has two components. One is to identify the recurrent patterns of inference used in everyday discourse. The second is to devise critical questions to evaluate the inferences in these patterns. Although this approach is intuitive and contains many insightful ideas, it has been noted to be not free of problems. One is that, because it disavows the probability calculus, it cannot give the exact strength of an inference. To tackle this problem, thereby paving the way to a more complete normative account of argument strength, it has been proposed that the most promising route is to combine the scheme-based approach with Bayesian networks (BNs). This paper pursues this line of thought, attempting to combine three common schemes, Appeal to Ignorance, Composition, and Division, with BNs. In the first part, it is argued that most (if not all) formulations of the critical questions corresponding to these schemes in the current argumentation literature are incomplete and not very informative; to remedy these flaws, more thorough and precise formulations of these questions are provided. The second part shows how to use graphical idioms (e.g., measurement and synthesis idioms) to translate the schemes, as well as their corresponding critical questions, into the graphical structure of BNs, and how to define the probability tables of the nodes using functions of various sorts. In the final part, it is argued that many misuses of these schemes, traditionally called fallacies with the same names as the schemes, can indeed be adequately accounted for by the BN models proposed in this paper.
Keywords: appeal to ignorance, argument schemes, Bayesian networks, composition, division
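A minimal sketch of an appeal-to-ignorance argument encoded as a BN, assuming the pgmpy library; the structure follows the usual "no evidence found despite search" pattern, and all probabilities are illustrative assumptions, not the paper's models:

```python
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

# "No evidence of X was found, therefore X is false"; the inference is
# stronger when the search for evidence was thorough.
model = BayesianNetwork([("Claim", "Found"), ("Search", "Found")])
model.add_cpds(
    TabularCPD("Claim", 2, [[0.5], [0.5]]),    # 0 = true, 1 = false
    TabularCPD("Search", 2, [[0.5], [0.5]]),   # 0 = thorough, 1 = shallow
    TabularCPD("Found", 2,
               # Columns: (Claim=0,Search=0), (0,1), (1,0), (1,1)
               [[0.95, 0.40, 0.02, 0.01],      # Found = yes
                [0.05, 0.60, 0.98, 0.99]],     # Found = no
               evidence=["Claim", "Search"], evidence_card=[2, 2]),
)
infer = VariableElimination(model)
# Posterior that the claim is false, given a thorough search found nothing.
print(infer.query(["Claim"], evidence={"Found": 1, "Search": 0}))
```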
Procedia PDF Downloads 285
2663 Computer Simulation of Hydrogen Superfluidity through Binary Mixing
Authors: Sea Hoon Lim
Abstract:
A superfluid is a fluid of bosons that flows without resistance. In order to be a superfluid, a substance's particles must behave like bosons yet remain mobile enough to flow. Bosons are particles that, at low temperature, can occupy the same quantum state; if bosons are cooled down, the particles all try to occupy the lowest energy state, which is called Bose-Einstein condensation. Boson statistics start to matter once the substance reaches its critical temperature. For example, when helium reaches its critical temperature of 2.17 K, the liquid density drops and it becomes a superfluid with zero viscosity. However, most materials solidify, and thus do not remain fluids, at temperatures well above the temperature at which they would otherwise become superfluid. Only a few substances currently known are capable of at once remaining a fluid and manifesting boson statistics; the most well known of these is helium and its isotopes. Because hydrogen is lighter than helium, and thus expected to manifest Bose statistics at higher temperatures than helium, one might expect hydrogen also to be a superfluid. As of today, however, no one has been able to produce a bulk hydrogen superfluid. The reason hydrogen has not formed a superfluid is its intermolecular interactions: hydrogen molecules are much more likely to crystallize than their helium counterparts. The key to creating a hydrogen superfluid is therefore finding a way to reduce the effect of the interactions among hydrogen molecules, postponing solidification to lower temperatures. In this work, we attempt via computer simulation to produce bulk superfluid hydrogen through binary mixing, a technique of mixing two pure substances in order to avoid crystallization and enhance superfluidity. Our mixture here is KALJ H2. We then sample the partition function using Path Integral Monte Carlo (PIMC), which is well suited to the equilibrium properties of low-temperature bosons and captures not only the statistics but also the dynamics of hydrogen. Via this sampling, we produce a time evolution of the substance and see if it exhibits superfluid properties.
Keywords: superfluidity, hydrogen, binary mixture, physics
Procedia PDF Downloads 316
2662 Feasibility Study of Wind Energy Potential in Turkey: Case Study of Catalca District in Istanbul
Authors: Mohammed Wadi, Bedri Kekezoglu, Mustafa Baysal, Mehmet Rida Tur, Abdulfetah Shobole
Abstract:
This paper investigates the technical evaluation of the wind potential for present and future investments in Turkey, taking into account the feasibility of sites, installations, operation, and maintenance. The evaluation is based on hourly wind speed data measured over the three years 2008-2010 at 30 m height for the Çatalca district. These data, obtained from the national meteorology station in Istanbul, Republic of Turkey, are analyzed in order to evaluate the feasibility of the wind power potential and to support the best selection of wind turbines for the area of interest. Furthermore, the data are extrapolated to and analyzed at 60 m and 80 m, taking into account the variability of the roughness factor. The two-parameter Weibull probability function is used to approximate monthly and annual wind potential and power density based on three calculation methods, namely the approximated, the graphical and the energy pattern factor methods. The annual mean wind power densities were found to be 400.31, 540.08 and 611.02 W/m² for 30, 60, and 80 m heights, respectively. Simulation results indicate that the analyzed area is an appropriate place for constructing large-scale wind farms.
Keywords: wind potential in Turkey, Weibull bi-parameter probability function, the approximated method, the graphical method, the energy pattern factor method, capacity factor
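A minimal sketch of the energy pattern factor method for fitting the two-parameter Weibull and computing power density; the hourly wind speeds are simulated stand-ins for the Çatalca measurements:

```python
import numpy as np
from scipy.special import gamma

# One year of hourly wind speeds (m/s); a Weibull draw stands in for data.
rng = np.random.default_rng(5)
v = rng.weibull(2.0, 8760) * 7.0

v_mean = v.mean()
epf = np.mean(v**3) / v_mean**3        # energy pattern factor
k = 1.0 + 3.69 / epf**2                # Weibull shape (EPF method)
c = v_mean / gamma(1.0 + 1.0 / k)      # Weibull scale (m/s)

rho = 1.225                            # air density (kg/m^3), assumed
power_density = 0.5 * rho * c**3 * gamma(1.0 + 3.0 / k)
print(f"k = {k:.2f}, c = {c:.2f} m/s, P = {power_density:.1f} W/m^2")
```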
Procedia PDF Downloads 257
2661 Quality of Service of Transportation Networks: A Hybrid Measurement of Travel Time and Reliability
Authors: Chin-Chia Jane
Abstract:
In a transportation network, travel time refers to the transmission time from source node to destination node, whereas reliability refers to the probability of a successful connection from source node to destination node. With an increasing emphasis on quality of service (QoS), both performance indexes are significant in the design and analysis of transportation systems. In this work, we extend the well-known flow network model for transportation networks so that travel time and reliability are integrated into the QoS measurement simultaneously. In the extended model, in addition to the general arc capacities, each intermediate node has a time weight, which is the travel time per unit of commodity going through the node. Meanwhile, arcs and nodes are treated as binary random variables that switch between operation and failure with associated probabilities. For a pre-specified travel time limitation and demand requirement, the QoS of a transportation network is the probability that the source can successfully transport the demand requirement to the destination while the total transmission time is under the travel time limitation. This work is pioneering: whereas the existing literature evaluates travel time reliability via a single optimal path, the proposed QoS measure addresses the performance of the whole network system. To compute the QoS of a transportation network, we first transform the extended network model into an equivalent min-cost max-flow network model. In the transformed network, each original arc has a new travel time weight, which takes the value 0. Each intermediate node is replaced by two nodes u and v, and an arc directed from u to v; the newly generated nodes u and v are perfect nodes, and the new directed arc has three weights: travel time, capacity, and operation probability. The universal set of state vectors is then recursively decomposed into disjoint subsets of reliable, unreliable, and stochastic vectors until no stochastic vector is left; the decomposition is made possible by applying an existing efficient min-cost max-flow algorithm. Because the reliable subsets are disjoint, the QoS can be obtained directly by summing their probabilities. Computational experiments are conducted on a benchmark network with 11 nodes and 21 arcs; five travel time limitations and five demand requirements are set to compute the QoS value. For comparison, we test the exhaustive complete enumeration method, and the computational results reveal that the proposed algorithm is much more efficient. In summary, a transportation network is analyzed by an extended flow network model in which each arc has a fixed capacity, each intermediate node has a time weight, and both arcs and nodes are independent binary random variables; the quality of service of the transportation network integrates customer demands, travel time, and the probability of connection, and we present a decomposition algorithm that computes the QoS efficiently.
Keywords: quality of service, reliability, transportation network, travel time
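A minimal sketch of the node-splitting transformation on a toy network, assuming networkx for the min-cost max-flow step; the capacities and time weights are illustrative, and the stochastic decomposition over arc/node states is not shown:

```python
import networkx as nx

# Each intermediate node with a per-unit travel-time weight becomes an arc
# u -> v carrying that weight; ordinary arcs get travel time weight 0.
G = nx.DiGraph()
# Intermediate node "a" (time weight 2) becomes arc a_in -> a_out.
G.add_edge("s", "a_in", capacity=5, weight=0)
G.add_edge("a_in", "a_out", capacity=5, weight=2)   # node time weight
G.add_edge("a_out", "t", capacity=4, weight=0)
# A second route through intermediate node "b" (time weight 3).
G.add_edge("s", "b_in", capacity=3, weight=0)
G.add_edge("b_in", "b_out", capacity=3, weight=3)
G.add_edge("b_out", "t", capacity=3, weight=0)

flow = nx.max_flow_min_cost(G, "s", "t")
total_time = nx.cost_of_flow(G, flow)     # total travel time of the flow
flow_value = sum(flow["s"][v] for v in flow["s"])
print(f"max flow = {flow_value}, total travel time = {total_time}")
```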
Procedia PDF Downloads 220
2660 Intergenerational Class Mobility in Greece: A Cross-Cohort Analysis with Evidence from European Union-Statistics on Income and Living Conditions
Authors: G. Stamatopoulou, M. Symeonaki, C. Michalopoulou
Abstract:
In this work, we study intergenerational social mobility in Greece in order to provide up-to-date evidence on changes in mobility patterns over the years. An analysis is carried out for both men and women aged 25-64. Three main research objectives are addressed: first, we examine the relationship between the socio-economic status of parents and that of their children; secondly, we investigate the evolution of mobility patterns across different birth cohorts; finally, we explore the role of education in shaping mobility patterns. For the analysis, we draw data on both parental and individual social outcomes from different national databases. The social classes of origin and destination are measured according to the European Socio-Economic Classification (ESeC), while the respondents' educational attainment is coded into categories based on the International Standard Classification of Education (ISCED). Applying Markov transition probability theory and a range of measures and models, this work focuses on the magnitude and direction of the movements that take place in the Greek labour market, as well as the level of social fluidity. Three-way mobility tables are presented, in which the transition probabilities between the classes of destination and origin are calculated for different cohorts, and a range of absolute and relative mobility rates, as well as distance measures, are reported. The study covers a large time span, from 1940 until 1995, shedding light on the effects of national institutional processes on the social movements of individuals. Given the evidence on the mobility patterns of the most recent birth cohorts, we also investigate the possible effects of the 2008 economic crisis.
Keywords: cohort analysis, education, Greece, intergenerational mobility, social class
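A minimal sketch of turning an origin-destination mobility table into a row-stochastic transition probability matrix, with illustrative counts and a coarse three-class schema rather than the full ESeC classes:

```python
import numpy as np
import pandas as pd

# Hypothetical origin (rows: parental class) x destination (columns:
# respondent's class) counts for one birth cohort.
classes = ["salariat", "intermediate", "working"]
counts = pd.DataFrame(
    [[120, 60, 20],
     [50, 140, 60],
     [30, 90, 180]],
    index=classes, columns=classes)

P = counts.div(counts.sum(axis=1), axis=0)   # outflow transition rates
total_mobility = 1 - np.trace(counts.values) / counts.values.sum()
print(P.round(3))
print(f"total (absolute) mobility rate: {total_mobility:.3f}")
```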
Procedia PDF Downloads 129
2659 Comparative Study to Evaluate the Efficacy of Control Criterion in Determining Consolidation Scope in the Public Sector
Authors: Batool Zarei
Abstract:
This study aims to answer the question of whether the control criterion, with its two elements of power and benefit, introduced as the 'control criterion of consolidation scope' in national and international public sector (and also private sector) accounting standards, is efficient enough. The methodology of this study is comparative, and the results are significantly generalizable, given the importance of the sample of countries studied. The findings state that, despite the pervasive use of the control criterion (including its two elements of power and benefit), the criteria for determining the existence of control in public sector accounting standards are not efficient enough to determine the consolidation scope of whole-of-government financial statements in a way that meets the decision-making and accountability needs of managers, policy makers and supervisors, especially parliament. Therefore, the researcher believes that, for determining consolidation scope in the public sector, it is better to pay attention not only to the economic view but also to budgetary, legal and statistical concepts, as well as to practical and financial risk, and to define indicators for proving the existence of control (power and benefit) that include accountability relationships (budgetary relation, legal form and nature of activity). These findings also reveal the necessity of passing comprehensive public financial management (PFM) legislation in order to redefine the characteristics of public sector entities and the scope of whole-of-government financial statements, and of reviewing the duties of statistics organizations and central banks in preparing government financial statistics and national accounts, in order to achieve the goals of sustainable development and a resilient economy.
Keywords: control, consolidation scope, public sector accounting, government financial statistics, resilient economy
Procedia PDF Downloads 258
2658 Failure Statistics Analysis of China’s Spacecraft in Full-Life
Authors: Xin-Yan Ji
Abstract:
Historical spacecraft failure data are very useful for improving spacecraft design and test philosophies and for reducing flight risk. A study of spacecraft failure data was performed, constituting the most comprehensive spacecraft statistics in China: 2,593 on-orbit failures and 1,298 ground failures occurring on 150 spacecraft launched from 2000 to 2016 were identified and collected, covering navigation satellites, communication satellites, remote sensing, deep space exploration and manned spaceflight platforms. In this paper, the failures are analyzed to compare different spacecraft subsystems and estimate their impact on the mission; the development of spacecraft in China is then evaluated in terms of design, software, workmanship, management, parts, and materials. The lessons learned from the past years show that electrical and mechanical failures are responsible for the largest share, and that the keys to reducing in-orbit failures are improved design technology, sufficient redundancy, adequate space environment protection measures, and adequate ground testing.
Keywords: spacecraft anomalies, anomalies mechanism, failure cause, spacecraft testing
Procedia PDF Downloads 115
2657 A Study on Readers' Motivation and Satisfaction with Sports Newspapers in Vietnam
Authors: Trang Huyen Nguyen, Thien Tri Huynh
Abstract:
The objectives of this paper were to determine the demographics of readers in Ho Chi Minh City (HCMC), to study the reading motivations that lead citizens to read sports newspapers, and to measure readers' satisfaction with issues related to sports newspapers. The subjects of the survey were HCMC citizens; 568 useful responses were collected (a response rate of 94.7%). The data analysis included descriptive and inferential statistics, computed with the SPSS 16.0 program, for the research questions. The majority of respondents were male, aged 24 to 32, held a first degree, and earned between US$150 and US$300 per month. Moreover, most were government officials, read the newspaper 11 to 20 times per month, and bought newspapers themselves. Finding information to predict the results of sports matches was the strongest motive among readers, and the diversity of information gave readers the greatest satisfaction with sports newspapers. According to the findings, the board of editors could use this information to make a strategic plan for the newspaper's content as well as its design, to meet the increasing demands of readers.
Keywords: motivation, satisfaction, readers, sports newspapers
Procedia PDF Downloads 302
2656 Examining the Relationship between Chi-Square Test Statistics and Skewness of Weibull Distribution: Simulation Study
Authors: Rafida M. Elobaid
Abstract:
Most of the literature on goodness-of-fit tests tries to provide a theoretical basis for studying empirical distribution functions; examples of such goodness-of-fit tests are the Kolmogorov-Smirnov and Cramér-von Mises type tests. However, most of the literature has not focused in detail on the relationship between the values of the test statistics and skewness or kurtosis. The aim of this study is to investigate the behavior of the χ2 test statistic as the skewness of a right-skewed distribution varies. A simulation study is conducted in which random numbers are generated from the Weibull distribution; for fixed sample sizes, different levels of skewness are considered, and the corresponding values of the χ2 test statistic are calculated. Across different sample sizes, the results show an inverse relationship between the value of the χ2 test statistic and the level of skewness for the Weibull distribution, i.e., the value of the χ2 test statistic decreases as skewness increases. The results also show that with large values of skewness we are more confident that the data follow the assumed distribution. The nonparametric Kendall τ test is used to confirm these results.
Keywords: goodness-of-fit test, chi-square test, simulation, continuous right skewed distributions
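A minimal sketch of the simulation design: vary the Weibull shape (and hence skewness), compute a binned chi-square goodness-of-fit statistic against a fitted Weibull, and test the association with Kendall's tau; the bin count and sample size are illustrative choices:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
n, n_bins = 500, 10
shapes = np.linspace(1.0, 4.0, 13)   # smaller shape -> larger right skew
skews, chi2s = [], []

for k in shapes:
    x = rng.weibull(k, n)
    # Fit a Weibull (location fixed at 0) and bin into equiprobable cells.
    c_hat, loc, scale = stats.weibull_min.fit(x, floc=0)
    dist = stats.weibull_min(c_hat, scale=scale)
    edges = dist.ppf(np.linspace(0, 1, n_bins + 1))
    observed, _ = np.histogram(x, bins=edges)
    expected = np.full(n_bins, n / n_bins)
    chi2s.append(((observed - expected) ** 2 / expected).sum())
    skews.append(stats.skew(x))

tau, p = stats.kendalltau(skews, chi2s)
print(f"Kendall tau = {tau:.3f} (p = {p:.3f})")
```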
Procedia PDF Downloads 420
2655 An Exploratory Sequential Design: A Mixed Methods Model for the Statistics Learning Assessment with a Bayesian Network Representation
Authors: Zhidong Zhang
Abstract:
This study established a mixed methods model for assessing statistics learning with Bayesian network models. There are three variants of the exploratory sequential design; the one used here consists of three linked steps: qualitative data collection and analysis; development of a quantitative measure, instrument, or intervention; and quantitative data collection and analysis. The study used an analysis of variance (ANOVA) scoring model as the content domain, examining students' learning in both semantic and performance aspects at a fine-grained level. The ANOVA score model, y = α + βx1 + γx2 + ε, served as the cognitive task for collecting data during the student learning process. When the learning processes were decomposed into multiple steps in both semantic and performance aspects, a hierarchical Bayesian network was established; this is a theory-driven process, and the hierarchical structure was obtained through qualitative cognitive analysis. The data from students' learning of the ANOVA score model were used to provide evidence for the hierarchical Bayesian network model through the evidential variables. Finally, the assessment results of students' learning of the ANOVA score model were reported. Briefly, this was a mixed methods research design applied to statistics learning assessment; such designs expand the possibilities for researchers to establish advanced quantitative models initially grounded in a theory-driven qualitative mode.
Keywords: exploratory sequential design, ANOVA score model, Bayesian network model, mixed methods research design, cognitive analysis
Procedia PDF Downloads 177
2654 Forecast of the Small Wind Turbines Sales with Replacement Purchases and with or without Account of Price Changes
Authors: V. Churkin, M. Lopatin
Abstract:
The purpose of the paper is to estimate the US market potential for small wind turbines and to forecast small wind turbine sales in the US. The forecasting method is based on the application of the Bass model and the generalized Bass model of innovation diffusion under replacement purchases. In this work, an exponential distribution is used to model replacement purchases; its single parameter is determined by the average lifetime of small wind turbines. The identification of the model parameters is based on nonlinear regression analysis of the annual sales statistics published by the American Wind Energy Association (AWEA) for 2001 to 2012. The estimate of the US average market potential for small wind turbines (for adoption purchases), without accounting for price changes, is 57,080 (95% confidence interval: 49,294 to 64,866) for an average turbine lifetime of 15 years, and 62,402 (95% confidence interval: 54,154 to 70,648) for an average lifetime of 20 years; the explained variance is 90.7% and 91.8%, respectively. The effect of wind turbine price changes on sales was estimated using the generalized Bass model. This required a price forecast, for which a polynomial regression function based on the Berkeley Lab statistics was used. The estimate of the US average market potential (for adoption purchases) in that case is 42,542 (95% confidence interval: 32,863 to 52,221) for an average lifetime of 15 years, and 47,426 (95% confidence interval: 36,092 to 58,760) for an average lifetime of 20 years; in both cases, the explained variance is 95.3%.
Keywords: Bass model, generalized Bass model, replacement purchases, sales forecasting of innovations, statistics of sales of small wind turbines in the United States
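A minimal sketch of fitting the basic Bass model to annual sales by nonlinear least squares (replacement purchases and the price term of the generalized model omitted); the sales series is a hypothetical stand-in for the AWEA statistics:

```python
import numpy as np
from scipy.optimize import curve_fit

def bass_sales(t, m, p, q):
    # New adopters in year t under the closed-form Bass adoption curve
    # F(t) = (1 - exp(-(p+q)t)) / (1 + (q/p) exp(-(p+q)t)).
    F = lambda s: (1 - np.exp(-(p + q) * s)) / (1 + (q / p) * np.exp(-(p + q) * s))
    return m * (F(t) - F(t - 1))

years = np.arange(1, 13)               # 2001..2012 coded as t = 1..12
sales = np.array([2100, 2300, 2800, 3200, 3800, 4500,
                  5200, 5900, 6200, 6100, 5800, 5400])  # illustrative units

(m, p, q), _ = curve_fit(bass_sales, years, sales,
                         p0=(60000, 0.03, 0.4), maxfev=20000)
print(f"market potential m = {m:,.0f}, p = {p:.4f}, q = {q:.4f}")
```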
Procedia PDF Downloads 346
2653 Performance Evaluation of a Prioritized, Limited Multi-Server Processor-Sharing System that Includes Servers with Various Capacities
Authors: Yoshiaki Shikata, Nobutane Hanayama
Abstract:
We present a prioritized, limited multi-server processor sharing (PS) system in which each server has its own capacity and N (≥ 2) priority classes are allowed in each PS server. In each prioritized, limited server, a different service ratio is assigned to each class of request, and the number of requests processed simultaneously is limited to less than a certain number. Routing strategies for such prioritized, limited multi-server PS systems that take into account the capacity of each server are also presented, and a performance evaluation procedure for these strategies is discussed. Practical performance measures of these strategies, such as loss probability, mean waiting time, and mean sojourn time, are evaluated via simulation. In a PS server, at the arrival (or departure) of a request, the extension (shortening) of the remaining sojourn time of each request receiving service can be calculated from the number of requests of each class and the priority ratio. Utilizing a simulation program which executes these events and calculations, the performance of the proposed prioritized, limited multi-server PS rule can be analyzed. From the evaluation results, the most suitable routing strategy for the loss or waiting system is clarified.
Keywords: processor sharing, multi-server, various capacity, N-priority classes, routing strategy, loss probability, mean sojourn time, mean waiting time, simulation
Procedia PDF Downloads 330
2652 Comparison between Deterministic and Probabilistic Stability Analysis, Featuring Consequent Risk Assessment
Authors: Isabela Moreira Queiroz
Abstract:
Slope stability analyses are largely carried out by deterministic methods and evaluated through a single factor of safety. Although it is known that geotechnical parameters can show great dispersion, such analyses treat them as fixed and known. Probabilistic methods, in turn, incorporate the variability of the key input parameters (random variables), resulting in a range of safety factor values and thus enabling the determination of the probability of failure, an essential parameter in the calculation of risk (probability multiplied by the consequence of the event). Among the probabilistic methods, three are frequently used in the geotechnical community: FOSM (First-Order Second-Moment), Rosenblueth (point estimates) and Monte Carlo. This paper presents a comparison between the results of deterministic and probabilistic analyses (FOSM, Monte Carlo and Rosenblueth) applied to a hypothetical slope. The aim was to evaluate the behavior of the slope and carry out the consequent risk analysis, which is used to calculate the risk and to examine mitigation and control solutions. The results obtained by the three probabilistic methods were quite close. It should be noted that the calculation of risk makes it possible to rank the priority of mitigation measures for implementation. It is therefore recommended to perform a careful assessment of the geological-geotechnical model, incorporating uncertainty in viability, design, construction, operation and closure by means of risk management.
Keywords: probabilistic methods, risk assessment, risk management, slope stability
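A minimal FOSM sketch for an infinite-slope safety factor with random cohesion and friction angle; the slope geometry, distributions and moments are illustrative assumptions:

```python
import numpy as np
from scipy.stats import norm

# Infinite-slope factor of safety with random cohesion c (kPa) and
# friction angle phi (deg); unit weight, depth and slope angle fixed.
gamma_s, z, a = 18.0, 5.0, np.radians(30)

def fs(c, phi):
    num = c + gamma_s * z * np.cos(a) ** 2 * np.tan(np.radians(phi))
    return num / (gamma_s * z * np.sin(a) * np.cos(a))

mu = np.array([15.0, 30.0])    # means of c and phi
sd = np.array([3.0, 3.0])      # standard deviations

# Central-difference derivatives for the first-order variance estimate.
grad = np.array([
    (fs(mu[0] + sd[0], mu[1]) - fs(mu[0] - sd[0], mu[1])) / (2 * sd[0]),
    (fs(mu[0], mu[1] + sd[1]) - fs(mu[0], mu[1] - sd[1])) / (2 * sd[1]),
])
mu_fs = fs(*mu)
sd_fs = np.sqrt(np.sum((grad * sd) ** 2))   # variables assumed independent
beta = (mu_fs - 1.0) / sd_fs                # reliability index
print(f"FS mean = {mu_fs:.2f}, beta = {beta:.2f}, Pf = {norm.cdf(-beta):.4f}")
```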
Procedia PDF Downloads 389
2651 Timing and Probability of Presurgical Teledermatology: Survival Analysis
Authors: Felipa de Mello-Sampayo
Abstract:
The aim of this study is to analyze, from the patient's perspective, the timing and probability of using teledermatology, compared with a conventional referral system. The main added value of the dynamic stochastic model lies in its concrete application to patients waiting for dermatological surgery. Patients with low uncertainty about their health level should use teledermatology as soon as possible, which is precisely when teledermatology is least valuable. The results of the model were then tested empirically with the teledermatology network covering the area served by the Hospital Garcia da Horta, Portugal, which links the primary care centers of 24 health districts with the hospital's dermatology department via the corporate intranet of the Portuguese healthcare system. Health level volatility can be understood as the hazard of developing skin cancer, and the trend of the health level as the propensity to develop skin lesions. The results of the survival analysis suggest that the theoretical model can explain the use of teledermatology: it depends negatively on the volatility of patients' health and positively on the trend of health, i.e., the lower the risk of developing skin cancer and the younger the patient, the more presurgical teledermatology one expects to occur. Presurgical teledermatology also depends positively on out-of-pocket expenses and negatively on the opportunity costs of teledermatology, i.e., the lower the benefit missed by using teledermatology, the more presurgical teledermatology one expects to occur.
Keywords: teledermatology, wait time, uncertainty, opportunity cost, survival analysis
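A minimal sketch of the survival-analysis comparison, assuming the lifelines package; the waiting-time records are illustrative stand-ins for the hospital data:

```python
import pandas as pd
from lifelines import KaplanMeierFitter

# Time to surgical intervention under teledermatology vs. conventional
# referral; durations in days, event = 1 if surgery occurred (0 = censored).
df = pd.DataFrame({
    "wait_days": [30, 45, 60, 20, 90, 120, 75, 40, 55, 200],
    "event":     [1, 1, 1, 1, 0, 1, 1, 1, 0, 1],
    "tele":      [1, 1, 1, 1, 1, 0, 0, 0, 0, 0],   # 1 = teledermatology arm
})
kmf = KaplanMeierFitter()
for arm, label in [(1, "teledermatology"), (0, "conventional")]:
    sub = df[df["tele"] == arm]
    kmf.fit(sub["wait_days"], sub["event"], label=label)
    print(label, "median wait:", kmf.median_survival_time_)
```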
Procedia PDF Downloads 126