Search results for: probability and statistics
2761 Effect of Atmospheric Turbulence on Hybrid FSO/RF Link Availability under Qatar's Harsh Climate
Authors: Abir Touati, Syed Jawad Hussain, Farid Touati, Ammar Bouallegue
Abstract:
Although there has been growing interest in hybrid free-space optical/radio frequency (FSO/RF) communication systems, the current literature is limited to results obtained in moderate or cold environments. In this paper, using a soft-switching approach, we investigate the effect of weather inhomogeneities on the strength of turbulence, hence the channel refractive index, under Qatar's harsh environment, and their influence on hybrid FSO/RF availability. In this approach, the FSO link, the RF link, both simultaneously, or neither can be active. Based on the soft-switching approach and a finite-state Markov chain (FSMC) process, we model the channel fading for the two links and derive a mathematical expression for the outage probability of the hybrid system. Then, we evaluate the behavior of the hybrid FSO/RF system under hazy and harsh weather. Results show that FSO/RF soft switching renders the system outage probability lower than that of each link individually. A soft-switching algorithm is being implemented on FPGAs using Raptor codes interfaced to the two terminals of a 1 Gbps/100 Mbps FSO/RF hybrid system, the first implemented in the region. Experimental results are compared to the above simulation results.
Keywords: atmospheric turbulence, haze, hybrid FSO/RF, outage probability, refractive index
Procedia PDF Downloads 392
2760 The Effects of Computer Game-Based Pedagogy on Graduate Students' Statistics Performance
Authors: Clement Yeboah, Eva Laryea
Abstract:
A pretest-posttest within-subjects experimental design was employed to examine the effects of a computerized basic statistics learning game on the achievement and statistics-related anxiety of students enrolled in an introductory graduate statistics course. Participants (N = 34) were graduate students in a variety of programs at a state-funded research university in the southeastern United States. We analyzed pretest-posttest differences using paired-samples t-tests for achievement and for statistics anxiety. The t-test for statistical knowledge was statistically significant, indicating significant mean gains in statistical knowledge as a function of the game-based intervention. Likewise, the t-test for statistics-related anxiety was also statistically significant, indicating a decrease in anxiety from pretest to posttest. The implications of the present study are significant for both teachers and students. For teachers, using computer games developed by the researchers can help to create a more dynamic and engaging classroom environment, as well as improve student learning outcomes. For students, playing these educational games can help to develop important skills such as problem solving, critical thinking, and collaboration. Students can develop an interest in the subject matter and spend quality time learning the course as they play the game, without realizing that they are learning a course presumed to be hard. The future directions of the present study are promising as technology continues to advance and become more widely available. Potential future developments include the integration of virtual and augmented reality into educational games, the use of machine learning and artificial intelligence to create personalized learning experiences, and the development of new and innovative game-based assessment tools.
It is also important to consider the ethical implications of computer game-based pedagogy, such as the potential for games to perpetuate harmful stereotypes and biases. As the field continues to evolve, it will be crucial to address these issues and work towards creating inclusive and equitable learning experiences for all students. This study has the potential to revolutionize the way graduate students learn basic statistics and offers exciting opportunities for future development and research. It is an important area of inquiry for educators, researchers, and policymakers and will continue to be a dynamic and rapidly evolving field for years to come.
Keywords: pretest-posttest within subjects, computer game-based learning, statistics achievement, statistics anxiety
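The pretest-posttest analysis described above can be sketched with a paired-samples t-test. The scores below are synthetic stand-ins (the abstract does not publish the raw achievement data); only the sample size N = 34 is taken from the study:

```python
from scipy import stats
import numpy as np

rng = np.random.default_rng(0)
n = 34  # sample size reported in the abstract
pre = rng.normal(60, 10, n)       # hypothetical pretest achievement scores
post = pre + rng.normal(8, 5, n)  # hypothetical posttest scores with a mean gain

# Paired-samples t-test on the within-subjects differences.
t_stat, p_value = stats.ttest_rel(post, pre)
mean_gain = float(np.mean(post - pre))
print(f"t = {t_stat:.2f}, p = {p_value:.4f}, mean gain = {mean_gain:.2f}")
```

A significant p-value with a positive mean gain mirrors the reported result for statistical knowledge; the anxiety analysis is the same test run on the anxiety scale scores.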
Procedia PDF Downloads 52
2759 Secrecy Analysis in Downlink Cellular Networks in the Presence of D2D Pairs and Hardware Impairment
Authors: Mahdi Rahimi, Mohammad Mahdi Mojahedian, Mohammad Reza Aref
Abstract:
In this paper, a cellular communication scenario with a transmitter and an authorized user is considered in order to analyze its secrecy in the face of eavesdroppers and the interference propagated unintentionally through the communication network. It is also assumed that some D2D pairs and eavesdroppers are randomly located in the cell. Assuming hardware impairment, the perfect connection probability is analytically calculated, and an upper bound is provided for the secrecy outage probability. In addition, a method based on random activation of D2D pairs is proposed to improve network security. Finally, the analytical results are verified by simulations.
Keywords: physical layer security, stochastic geometry, device-to-device, hardware impairment
Procedia PDF Downloads 142
2758 Traffic Safety and Risk Assessment Model by Analysis of Questionnaire Survey: A Case Study of S. G. Highway, Ahmedabad, India
Authors: Abhijitsinh Gohil, Kaushal Wadhvaniya, Kuldipsinh Jadeja
Abstract:
Road safety is a multi-sectoral and multi-dimensional issue. An effective model can assess the risk associated with highway safety. A questionnaire survey is essential to identify the events or activities which create unsafe conditions for traffic on an urban highway. A questionnaire of standard questions covering vehicular, human, and infrastructure characteristics can be prepared, and responses can be collected in the field from age-wise groups of road users. Each question or event holds a specific risk weightage, which contributes to creating inappropriate and unsafe traffic flow. The probability of occurrence of an event can be calculated from the data collected from road users. Finally, the risk score of each event can be calculated by combining its risk factor with its probability of occurrence, and the sum of the risk scores of all individual events gives the total risk score of a particular road. Standards for risk scores can be set, and the total risk score can be compared with these standards. Thus, roads can be categorized based on the associated risk and their traffic safety. With this model, one can assess the need for traffic safety improvement on a given road, and qualitative data can be analysed.
Keywords: probability of occurrence, questionnaire, risk factor, risk score
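The scoring scheme above can be sketched in a few lines. The event names, risk weights, and probabilities here are invented for illustration; in the study they would come from the questionnaire responses:

```python
# Hypothetical events with assumed risk weights; the probabilities of
# occurrence would be estimated from road users' questionnaire responses.
events = {
    "overspeeding":    {"risk_weight": 8, "probability": 0.45},
    "jaywalking":      {"risk_weight": 6, "probability": 0.30},
    "missing_signage": {"risk_weight": 5, "probability": 0.20},
    "potholes":        {"risk_weight": 7, "probability": 0.25},
}

# Risk score of an event = risk weight x probability of occurrence;
# the total risk score of the road is the sum over all events.
scores = {name: e["risk_weight"] * e["probability"] for name, e in events.items()}
total_risk = sum(scores.values())
print(f"total risk score = {total_risk:.2f}")
```

The total can then be compared against a chosen standard to place the road in a risk category.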
Procedia PDF Downloads 315
2757 Using Non-Negative Matrix Factorization Based on Satellite Imagery for the Collection of Agricultural Statistics
Authors: Benyelles Zakaria, Yousfi Djaafar, Karoui Moussa Sofiane
Abstract:
Agriculture is fundamental and remains an important sector in the Algerian economy; based on traditional techniques and structures, it generally serves domestic consumption. The collection of agricultural statistics in Algeria relies on traditional methods, which consist of investigating land use through surveys and field visits. These statistics suffer from problems such as poor data quality, the long delay between collection and final availability, and high cost relative to their limited use. The objective of this work is to develop a processing chain for a reliable inventory of agricultural land by developing and implementing a new method of information extraction. This methodology allowed us to combine remote sensing data with field data to collect statistics on areas of different land types. The contribution of remote sensing to the improvement of agricultural statistics, in terms of area, has been studied in the wilaya of Sidi Bel Abbes. It is in this context that we applied a method for extracting information from satellite images called non-negative matrix factorization (NMF), which does not consider the pixel as a single entity but looks for the components within the pixel itself. The results obtained by applying NMF were compared with field data and with the results obtained by the maximum likelihood method. We observed close agreement between the most important NMF results and the field data. We believe that this method of extracting information from satellite data leads to interesting results for different types of land use.
Keywords: blind source separation, hyper-spectral image, non-negative matrix factorization, remote sensing
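The unmixing idea can be illustrated with scikit-learn's NMF on a toy hyperspectral cube. The scene below is synthetic (3 invented endmember spectra mixed into 100 pixels), not the Sidi Bel Abbes data, and the component count is assumed known:

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(1)
# Toy stand-in for a hyperspectral scene: 100 pixels x 20 spectral bands,
# generated as non-negative mixtures of 3 endmember spectra.
endmembers = rng.uniform(0, 1, (3, 20))
abundances = rng.dirichlet(np.ones(3), size=100)  # rows sum to 1
X = abundances @ endmembers

# Factorize X ~ W @ H with all entries non-negative: each pixel is
# decomposed into components rather than treated as a single entity.
model = NMF(n_components=3, init="nndsvda", max_iter=500, random_state=0)
W = model.fit_transform(X)  # per-pixel abundances
H = model.components_       # estimated endmember spectra
print("reconstruction error:", round(model.reconstruction_err_, 4))
```

The recovered abundance maps (W) would then be compared with field data, as the abstract describes, to estimate areas per land-use class.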
Procedia PDF Downloads 391
2756 Consumer Attitude and Purchase Intention towards Organic Food: Insights from Pakistan
Authors: Muneshia Maheshwar, Kanwal Gul, Shakira Fareed, Ume-Amama Areeb Gul
Abstract:
Organic food is commonly known for its healthier content, produced without pesticides, herbicides, inorganic fertilizers, antibiotics, or growth hormones. The aim of this research is to examine the effect of health consciousness, environmental concern, and organic food knowledge on both the intention to buy organic foods and the attitude towards organic foods, as well as the effect of attitude towards organic foods on the intention to buy them, in Pakistan. Primary data were collected through a questionnaire adapted from previous research. Non-probability convenience sampling was used to select a sample of 200 consumers based in Karachi. The data were analyzed using descriptive statistics and multiple regression. The findings show that both attitude and the intention to buy organic food were affected by health consciousness, environmental concern, and organic food knowledge. The results also revealed that attitude affects the intention to buy organic food.
Keywords: health consciousness, attitude, intention to purchase, environmental concern, organic food knowledge
Procedia PDF Downloads 215
2755 Probabilistic Analysis of Bearing Capacity of Isolated Footing using Monte Carlo Simulation
Authors: Sameer Jung Karki, Gokhan Saygili
Abstract:
The allowable bearing capacity of foundation systems is determined by applying a factor of safety to the ultimate bearing capacity. Conventional ultimate bearing capacity calculation routines are based on deterministic input parameters, where the nonuniformity and inhomogeneity of soil and site properties are not accounted for; hence, probability calculus and statistical analysis cannot be directly applied to foundation engineering. It is assumed that the factor of safety, typically as high as 3.0, incorporates the uncertainty of the input parameters, but this factor is estimated from subjective judgement rather than objective facts and is an ambiguous term. Hence, a probabilistic analysis of the bearing capacity of an isolated footing on a clayey soil was carried out using the Monte Carlo simulation method, and the simulated model was compared with the traditional discrete model. The bearing capacity of the soil was found to be higher for the simulated model than for the discrete model, which was verified by a sensitivity analysis: as the number of simulations increased, there was a significant percentage increase in bearing capacity compared with the discrete value. The bearing capacity values obtained by simulation were found to follow a normal distribution. Using the traditional factor of safety of 3, the allowable bearing capacity had a lower probability (0.03717) of occurring in the field than the higher probability (0.15866) obtained using the simulation-derived factor of safety of 1.5. This means the traditional factor of safety gives a bearing capacity that is less likely to occur, or be available, in the field. This shows the subjective nature of the factor of safety, and hence the probabilistic method is suggested to address the variability of the input parameters in bearing capacity equations.
Keywords: bearing capacity, factor of safety, isolated footing, Monte Carlo simulation
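A minimal Monte Carlo sketch of this kind of analysis is shown below, using the undrained (phi = 0) bearing capacity formula q_ult = c*Nc + gamma*Df. The input distributions (cohesion, unit weight) and footing depth are illustrative assumptions, not the paper's values:

```python
import numpy as np

rng = np.random.default_rng(42)
n_sim = 100_000

# Assumed input distributions (illustrative values, not from the paper):
c = rng.normal(50, 10, n_sim)     # undrained cohesion, kPa
gamma = rng.normal(18, 1, n_sim)  # unit weight, kN/m^3
Df = 1.5                          # footing depth, m
Nc = 5.14                         # bearing capacity factor for phi = 0

q_ult = c * Nc + gamma * Df       # ultimate bearing capacity per realization, kPa
q_allow = q_ult / 3.0             # conventional factor of safety of 3

print(f"mean q_ult = {q_ult.mean():.1f} kPa")
print(f"5th percentile of q_allow = {np.percentile(q_allow, 5):.1f} kPa")
```

Instead of a single deterministic value, the simulation yields a full distribution of bearing capacity, from which probabilities of exceedance (and hence a probabilistically justified factor of safety) can be read off.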
Procedia PDF Downloads 155
2754 Bernstein Type Polynomials for Solving Differential Equations and Their Applications
Authors: Yilmaz Simsek
Abstract:
In this paper, we study Bernstein-type basis functions together with their generating functions. We give various properties of these polynomials with the aid of their generating functions. These polynomials and generating functions have many valuable applications in mathematics, probability, statistics, and mathematical physics. Using the Bernstein-Galerkin and Bernstein-Petrov-Galerkin methods, we give some applications of the Bernstein-type polynomials to solving high even-order differential equations, with their numerical computations. We also give Bezier-type curves related to the Bernstein-type basis functions and investigate fundamental properties of these curves, which have many applications in mathematics, computer-aided geometric design, and other related areas. Moreover, we simulate these polynomials with their plots for some selected numerical values.
Keywords: generating functions, Bernstein basis functions, Bernstein polynomials, Bezier curves, differential equations
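The classical Bernstein basis underlying these constructions is easy to evaluate directly; the sketch below shows the basis B_{k,n}(t) = C(n,k) t^k (1-t)^(n-k) and a Bezier curve built from it (a generic illustration, not the paper's generalized polynomials):

```python
from math import comb

def bernstein(n, k, t):
    """Bernstein basis polynomial B_{k,n}(t) = C(n,k) t^k (1-t)^(n-k)."""
    return comb(n, k) * t**k * (1 - t)**(n - k)

def bezier(control_points, t):
    """Bezier curve of degree n as a Bernstein-weighted sum of control points."""
    n = len(control_points) - 1
    x = sum(bernstein(n, k, t) * px for k, (px, _) in enumerate(control_points))
    y = sum(bernstein(n, k, t) * py for k, (_, py) in enumerate(control_points))
    return x, y

# The basis functions form a partition of unity: sum_k B_{k,n}(t) = 1.
total = sum(bernstein(3, k, 0.3) for k in range(4))
print(round(total, 10))  # 1.0
```

The partition-of-unity and endpoint-interpolation properties checked here are among the fundamental properties of Bezier-type curves the abstract refers to.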
Procedia PDF Downloads 239
2753 Statistical Modeling of Mobile Fading Channels Based on Triply Stochastic Filtered Marked Poisson Point Processes
Authors: Jihad S. Daba, J. P. Dubois
Abstract:
Understanding the statistics of non-isotropic scattering multipath channels that fade randomly with respect to time, frequency, and space in a mobile environment is crucial for the accurate detection of received signals in wireless and cellular communication systems. In this paper, we derive stochastic models for the probability density function (PDF) of the shift in the carrier frequency caused by the Doppler effect on the received illuminating signal in the presence of a dominant line of sight. Our derivation is based on a generalized Clarke's model and a two-wave partially developed scattering model, where the statistical distribution of the frequency shift is shown to be consistent with the power spectral density of the Doppler-shifted signal.
Keywords: Doppler shift, filtered Poisson process, generalized Clarke's model, non-isotropic scattering, partially developed scattering, Rician distribution
Procedia PDF Downloads 347
2752 Frequency Analysis Using Multiple Parameter Probability Distributions for Rainfall to Determine Suitable Probability Distribution in Pakistan
Authors: Tasir Khan, Yejuan Wang
Abstract:
The study of extreme rainfall events is very important for flood management in river basins and the design of water conservancy infrastructure. Evaluation of quantiles of annual maximum rainfall (AMRF) is required in various environmental fields, agricultural operations, renewable energy sources, climatology, and the design of different structures. Therefore, frequency analysis of the annual maximum rainfall (AMRF) was performed at different stations in Pakistan. Multiple probability distributions, log-normal (LN), generalized extreme value (GEV), Gumbel (max), and Pearson type 3 (P3), were used to find the most appropriate distributions at different stations. The L-moments method was used to estimate the distribution parameters. The Anderson-Darling test, Kolmogorov-Smirnov test, and chi-square test showed that two distributions, namely Gumbel (max) and LN, were the most appropriate. The quantile estimate of a multi-parameter probability distribution characterizes extreme rainfall at a specific location and is therefore important for decision-makers and planners who design and construct different structures. This result provides an indication of the consequences of these multi-parameter distributions for site studies, peak flow prediction, and the design of hydrological maps, and can therefore support hydraulic structure design and flood management.
Keywords: RAMSE, multiple frequency analysis, annual maximum rainfall, L-moments
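The fit-and-quantile workflow can be sketched with SciPy. Note two assumptions: the rainfall series below is synthetic (the Pakistani station data are not published in the abstract), and SciPy fits by maximum likelihood, whereas the paper uses the L-moments method:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
# Synthetic annual-maximum rainfall series (mm) standing in for real AMRF data.
amrf = stats.genextreme.rvs(c=-0.1, loc=80, scale=20, size=60, random_state=rng)

# Fit a GEV distribution (maximum likelihood here; the paper uses L-moments).
shape, loc, scale = stats.genextreme.fit(amrf)

# Quantile for the 100-year return period (non-exceedance probability 0.99).
q100 = stats.genextreme.ppf(0.99, shape, loc=loc, scale=scale)
print(f"100-year rainfall estimate: {q100:.1f} mm")

# Goodness of fit via the Kolmogorov-Smirnov test, one of the tests used.
ks = stats.kstest(amrf, "genextreme", args=(shape, loc, scale))
print(f"KS p-value: {ks.pvalue:.3f}")
```

Repeating the fit for LN, Gumbel, and P3 and comparing the goodness-of-fit statistics is how the most appropriate distribution per station is selected.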
Procedia PDF Downloads 53
2751 Approximate Confidence Interval for Effect Size Based on Bootstrap Resampling Method
Authors: S. Phanyaem
Abstract:
This paper presents confidence intervals for the effect size based on the bootstrap resampling method. A meta-analytic confidence interval for effect size that is easy to compute is proposed. A Monte Carlo simulation study was conducted to compare the performance of the proposed confidence intervals with the existing confidence intervals; the best confidence interval method will have a coverage probability close to 0.95. Simulation results show that our proposed confidence intervals perform well in terms of coverage probability and expected length.
Keywords: effect size, confidence interval, bootstrap method, resampling
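A basic percentile-bootstrap interval for a standardized mean difference (Cohen's d) illustrates the idea; the two samples below are synthetic, and this is the generic percentile method, not necessarily the specific interval the paper proposes:

```python
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical treatment and control samples.
treatment = rng.normal(10.5, 2.0, 50)
control = rng.normal(9.0, 2.0, 50)

def cohens_d(x, y):
    """Standardized mean difference with a pooled standard deviation."""
    pooled = np.sqrt((x.var(ddof=1) + y.var(ddof=1)) / 2)
    return (x.mean() - y.mean()) / pooled

# Percentile bootstrap: resample both groups with replacement many times
# and take the 2.5th and 97.5th percentiles of the effect-size replicates.
boot = np.array([
    cohens_d(rng.choice(treatment, treatment.size),
             rng.choice(control, control.size))
    for _ in range(2000)
])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"d = {cohens_d(treatment, control):.2f}, 95% CI = [{lo:.2f}, {hi:.2f}]")
```

Repeating this over many simulated datasets and counting how often the interval covers the true effect size gives the coverage probability used to benchmark the methods.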
Procedia PDF Downloads 567
2750 The Tourist Satisfaction on Logo Design of Huay Kon Border Market, Chaloemphrakiat District, Nan Province
Authors: Panupong Chanplin, Wilailuk Mepracha, Sathapath Kilaso
Abstract:
The aims of this research were twofold: 1) to design a logo for Huay Kon Border Market, Chaloemphrakiat District, Nan Province, and 2) to study the level of tourist satisfaction with the logo design of Huay Kon Border Market. Tourist satisfaction was measured using four criteria: a unique product identity, ease of remembrance, product utility, and beauty/impressiveness. The researcher utilized a probability sampling method via simple random sampling. The sample consisted of 30 tourists at the Huay Kon Border Market. The statistics used for data analysis were percentage, mean, and standard deviation. The results suggest that tourists had high levels of satisfaction with all four criteria of the logo design. This study proposes that the logo designed for Huay Kon Border Market could also be implemented in other real media already available on the market.
Keywords: satisfaction, logo, design, Huay Kon border market
Procedia PDF Downloads 199
2749 Invalidation of the Start of Lunar Calendars Based on Sighting of Crescent: A Survey of 101 Years of Data between 1938 and 2038
Authors: Rafik Ouared
Abstract:
The purpose of this paper is to invalidate decisions made by the Islamic conference held in Istanbul in 2016, which defined two basic criteria to determine the start of the lunar month: (1) all criteria are based on the sighting of the crescent, whether observed or computed with modern methods, and (2) the conference strongly recommended adopting the principle of 'unification of sighting', by which any occurrence of sighting anywhere would be applicable everywhere. To demonstrate the invalidity of those statements, 101 years of data, from 1938 to 2038, were analyzed to compare the probability density functions (PDFs) of the time difference between different types of fajr and the new moon. Two groups of fajr were considered: the 'natural fajr', which is the very first fajr following the new moon, and the 'biased fajr', which is defined by human convention, inclusive of all chosen definitions. Parametric and non-parametric statistical comparisons between the groups showed that all the biased PDFs differ significantly from the unbiased (natural) PDF, with probability values (p-values) less than 0.001; the significance level was fixed at 0.05. Conclusion: the ongoing reliance on sighting of the crescent induces a significant bias in defining the lunar calendar. Therefore, a 'natural' calendar would be more applicable, requiring a more contextualized revision of the issue in fiqh.
Keywords: biased fajr, lunar calendar, natural fajr, probability density function, sighting of crescent, time difference between fajr and new moon
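The non-parametric part of such a comparison can be sketched with a two-sample Kolmogorov-Smirnov test. The two samples below are synthetic stand-ins for the 'natural' and 'biased' time-difference distributions (the survey's actual values and spreads are assumptions here):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
# Hypothetical time differences (hours) between new moon and fajr:
natural = rng.normal(10.0, 3.0, 500)  # stand-in for the 'natural fajr' group
biased = rng.normal(28.0, 6.0, 500)   # crescent-based rules add roughly a day

# Non-parametric comparison of the two empirical distributions.
ks_stat, p_value = stats.ks_2samp(natural, biased)
print(f"KS statistic = {ks_stat:.3f}, p = {p_value:.2e}")
```

A p-value below 0.001 at the 0.05 significance level, as here, is the pattern the survey reports for every biased definition against the natural one.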
Procedia PDF Downloads 184
2748 An Analysis of the Effect of Sharia Financing and Work Relation Founding towards Non-Performing Financing in Islamic Banks in Indonesia
Authors: Muhammad Bahrul Ilmi
Abstract:
The purpose of this research is to analyze the influence of Islamic financing and work relation founding, simultaneously and partially, on non-performing financing in Islamic banks. This quantitative regression field research was conducted over three months at Muammalat Indonesia Bank and Islamic Danamon Bank. The population of this research comprised 15 account officers of Muammalat Indonesia Bank and Islamic Danamon Bank in Surakarta, Indonesia. Data were collected through documentation, questionnaires, a literature study, and interviews. The regression analysis shows that Islamic financing and work relation founding simultaneously have a positive and significant effect on the non-performing financing of the two Islamic banks, with a probability value of 0.003, which is less than 0.05, and an F value of 9.584. The regression of Islamic financing on non-performing financing shows a significant effect, supported by a multiple linear regression probability value of 0.001, which is less than 0.05. The regression of work relation founding on non-performing financing shows an insignificant effect, with a multiple linear regression probability value of 0.161, which is greater than 0.05.
Keywords: Syariah financing, work relation founding, non-performing financing (NPF), Islamic Bank
Procedia PDF Downloads 404
2747 Estimation of Location and Scale Parameters of Extended Exponential Distribution Based on Record Statistics
Authors: E. Krishna
Abstract:
An extended form of the exponential distribution using the Marshall and Olkin method is introduced. The location-scale family of these distributions is considered. For the location-scale-free family, exact expressions for single and product moments of upper record statistics are derived. The mean, variance, and covariance of record values are computed for various values of the shape parameter. Using these, the BLUEs of the location and scale parameters are derived, and the variances and covariance of the estimates are obtained. Through Monte Carlo simulation, confidence intervals for the location and scale parameters are constructed. The best linear unbiased predictor (BLUP) of future records is also discussed.
Keywords: BLUE, BLUP, confidence interval, Marshall-Olkin distribution, Monte Carlo simulation, prediction of future records, record statistics
Procedia PDF Downloads 394
2746 Finite Difference Based Probabilistic Analysis to Evaluate the Impact of Correlation Length on Long-Term Settlement of Soft Soils
Authors: Mehrnaz Alibeikloo, Hadi Khabbaz, Behzad Fatahi
Abstract:
Probabilistic analysis has become one of the most popular methods to quantify and manage geotechnical risks due to the spatial variability of soil input parameters. The correlation length, defined as the distance within which the random variables are strongly correlated, is one of the key factors in quantifying the spatial variability of soil parameters. This paper aims to assess the impact of correlation length on the long-term settlement of soft soils improved with preloading. The concept of a 'worst-case' spatial correlation length was evaluated by determining the probability of failure for a real case study, the Vasby test fill. For this purpose, a finite difference code was developed based on axisymmetric consolidation equations, incorporating a non-linear elastic visco-plastic model and the Karhunen-Loeve expansion method. The results show that correlation length has a significant impact on the post-construction settlement of soft soils: as the correlation length increases, the probability of failure increases and approaches an asymptote.
Keywords: Karhunen-Loeve expansion, probability of failure, soft soil settlement, 'worst case' spatial correlation length
Procedia PDF Downloads 137
2745 A Knowledge-Based Development of Risk Management Approaches for Construction Projects
Authors: Masoud Ghahvechi Pour
Abstract:
Risk management is a systematic and regular process of identifying, analyzing, and responding to risks throughout a project's life cycle in order to achieve the optimal level of elimination, reduction, or control of risk. The purpose of project risk management is to increase the probability and effect of positive events and to reduce the probability and effect of unpleasant events on the project. Risk management is one of the most fundamental parts of project management; unmanaged or untransmitted risks can be a primary factor of failure in a project. Effective risk management does not mean simply avoiding risk, which is apparently the cheapest option. The main problem with that option is economic: what is potentially profitable is by definition risky, while what poses no risk is rarely economically interesting and does not bring tangible benefits. Therefore, effective risk management for an implemented project means finding a 'middle ground'. On the one hand, it means protection against risk through accurate identification and classification, leading to a comprehensive analysis; on the other hand, management using all mathematical and analytical tools should be based on checking the maximum benefits of these decisions. A detailed analysis taking into account all aspects of the company, including stakeholder analysis, allows what will become tangible future benefits for the project to be added to effective risk management. Identifying project risk is based on determining which types of risk may affect the project, together with their specific parameters and estimates of the probability of their occurrence in the project.
These conditions can be divided into three groups: certainty, uncertainty, and risk, which in turn support three types of investment attitude: risk preference, risk neutrality, and specific risk deviation, together with their measurement. The result of risk identification and project analysis is a list of events indicating the cause and probability of each event, and a final assessment of its impact on the environment.
Keywords: risk, management, knowledge, risk management
Procedia PDF Downloads 30
2744 Congestion Control in Mobile Network by Prioritizing Handoff Calls
Authors: O. A. Lawal, O. A Ojesanmi
Abstract:
The demand for wireless cellular services continues to increase while radio resources remain limited; thus, network operators must continuously manage the scarce radio resources in order to maintain an acceptable quality of service for mobile users. This paper proposes handling congestion in the mobile network by prioritizing handoff calls using the guard channel allocation scheme. The algorithm uses a specific threshold value for the time of allocation of the channel. The scheme is simulated by generating data for different traffic loads in the network, as would occur in real life. The results are used to determine the probability of handoff call dropping and the probability of new call blocking as measures of network performance.
Keywords: call block, channel, handoff, mobile cellular network
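The guard channel idea can be illustrated with a crude discrete-step simulation: new calls may only use channels below a threshold, while handoffs may use all channels including the reserved guard channels. All numbers below (channel counts, arrival and departure probabilities, handoff fraction) are invented, and this is not a calibrated traffic model:

```python
import random

random.seed(0)
TOTAL = 20   # total channels in the cell (assumed)
GUARD = 2    # channels reserved for handoff calls (assumed threshold)

in_use = 0
blocked_new = dropped_handoff = new_calls = handoffs = 0

for _ in range(100_000):
    if in_use and random.random() < 0.3:  # at most one call ends per step
        in_use -= 1
    if random.random() < 0.5:             # an arrival occurs this step
        if random.random() < 0.2:         # 20% of arrivals are handoffs
            handoffs += 1
            if in_use < TOTAL:            # handoffs may use guard channels
                in_use += 1
            else:
                dropped_handoff += 1
        else:                             # new calls cannot use guard channels
            new_calls += 1
            if in_use < TOTAL - GUARD:
                in_use += 1
            else:
                blocked_new += 1

print(f"new-call blocking prob. = {blocked_new / new_calls:.4f}")
print(f"handoff dropping prob.  = {dropped_handoff / handoffs:.4f}")
```

Because handoffs see more available channels than new calls, the dropping probability comes out lower than the blocking probability, which is exactly the prioritization the scheme is designed to achieve.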
Procedia PDF Downloads 368
2743 Comparison of Wind Fragility for Window System in the Simplified 10 and 15-Story Building Considering Exposure Category
Authors: Viriyavudh Sim, WooYoung Jung
Abstract:
Window systems in high-rise buildings are occasionally subjected to excessive wind intensity, particularly during typhoons. Failure of the window system does not affect the overall safety of the structure; however, it can endanger the safety of the residents. In this paper, fragility curves for the window systems of two residential buildings were compared. The probability of failure of each individual window was determined with the Monte Carlo simulation method, and a lognormal cumulative distribution function was used to represent the fragility. The results showed that windows located on the edge of the leeward wall were more susceptible to wind load, and the probability of failure of each window panel increased at higher floors.
Keywords: wind fragility, window system, high rise building, wind disaster
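The Monte Carlo step of a fragility analysis can be sketched as below: for each wind speed, random demand (wind pressure) is compared against random capacity (window resistance), and the failure fraction gives one point of the fragility curve. The resistance distribution and pressure coefficient here are illustrative assumptions, not the paper's values:

```python
import numpy as np

rng = np.random.default_rng(5)
n_sim = 50_000
# Assumed lognormal window resistance (kPa); median 3.0, log-std 0.2.
resistance = rng.lognormal(mean=np.log(3.0), sigma=0.2, size=n_sim)

wind_speeds = np.arange(20, 71, 5)  # m/s
rho = 1.225                         # air density, kg/m^3
fragility = []
for v in wind_speeds:
    # Wind pressure q = 0.5 * rho * v^2 (Pa -> kPa), scaled by a random
    # pressure coefficient to model demand uncertainty.
    q = 0.5 * rho * v**2 / 1000.0 * rng.normal(1.0, 0.1, n_sim)
    fragility.append(np.mean(q > resistance))  # Monte Carlo failure probability

fragility = np.array(fragility)
print(np.round(fragility, 3))
```

Fitting a lognormal cumulative distribution function through these points, as the paper does, then yields a smooth fragility curve per window panel.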
Procedia PDF Downloads 292
2742 Sufficient Conditions for Exponential Stability of Stochastic Differential Equations with Non Trivial Solutions
Authors: Fakhreddin Abedi, Wah June Leong
Abstract:
Exponential stability of stochastic differential equations with non-trivial solutions is established in terms of Lyapunov functions. The main result of this paper shows that, under certain hypotheses on the dynamics f(.) and g(.), practical exponential stability in probability in a small neighborhood of the origin is equivalent to the existence of an appropriate Lyapunov function. Indeed, we establish exponential stability of a stochastic differential equation when almost all state trajectories are bounded and approach a sufficiently small neighborhood of the origin. We derive sufficient conditions for exponential stability of stochastic differential equations and give a numerical example illustrating our results.
Keywords: exponential stability in probability, stochastic differential equations, Lyapunov technique, Ito's formula
Procedia PDF Downloads 24
2741 Comparison of Conjunctival Autograft versus Amniotic Membrane Transplantation for Pterygium Surgery
Authors: Luksanaporn Krungkraipetch
Abstract:
Currently, surgery is the only known effective treatment for pterygium. In certain groups, the probability of recurrence after bare sclera excision is very high. Tissue grafting is substantially more time-consuming and challenging than leaving the sclera uncovered, but it reduces the chance of recurrence. Conjunctival autograft surgery is an older technique than amniotic membrane grafting. The purpose of this study was to compare pterygium surgery with conjunctival autograft against amniotic membrane transplantation. A randomized controlled trial design was used. Four cases were excluded (two for failing to meet the inclusion criteria and two for refusing to participate). Group I (n = 40) received the intervention, whereas Group II (n = 40) served as the control. Both descriptive and inferential statistical approaches were used: descriptive statistics covered basic pterygium surgery information and the risk of recurrent pterygium, and the chi-square test was used as the inferential statistic, with a p-value of 0.05 considered statistically significant. The majority of patients in Group I were female (70.0%), aged 41-60 years, had no underlying disease (95.0%), and had nasal pterygium (97.5%). The majority of Group II patients were female (60.0%), aged 41-60 years, had no underlying disease (97.5%), and had nasal pterygium (97.5%). Group I had no recurrence of pterygium after surgery, whereas Group II had a 7.5% recurrence rate, with a typical time to recurrence of twelve months. Most pterygium recurrences occurred in females (83.3%), aged 41-60 years (66.7%), with no underlying disease, a recurrence period of six months (60%), and a nasal pterygium site (83.3%). Pterygium recurrence after surgery was associated with nasal location (p = .002).
Complications occurred in 16.7% of pterygium surgeries; one woman with nasal pterygium who underwent autograft surgery developed granulation tissue at the surgical site six months later, a mild complication. A comparison of recurrence rates revealed that conjunctival autograft had a higher recurrence rate than amniotic membrane transplantation (p = .013).
Keywords: pterygium, pterygium surgery, conjunctival autograft, amniotic membrane transplantation
Procedia PDF Downloads 43
2740 Producing Outdoor Design Conditions Based on the Dependency between Meteorological Elements: Copula Approach
Authors: Zhichao Jiao, Craig Farnham, Jihui Yuan, Kazuo Emura
Abstract:
It is common to use outdoor design weather data to select the air-conditioning capacity in the building design stage. The outdoor design weather data usually comprise multiple meteorological elements given separately for a 24-hour period, but the dependency between the elements is not well considered, which may cause an overestimation when selecting air-conditioning capacity. Considering the dependency between air temperature and global solar radiation, we used the copula approach to model the joint distribution of those two weather elements and suggest a new method of selecting more credible outdoor design conditions based on the specific simultaneous occurrence probability of air temperature and global solar radiation. In this paper, ten years of hourly weather data from 2001 to 2010 in Osaka, Japan, were used to analyze the dependency structure and joint distribution; the results show that the Joe-Frank copula fits almost all of the hourly data. By calculating the simultaneous occurrence probability and the common exceedance probability of air temperature and global solar radiation, the results show that the maximum differences in the design air temperature and global solar radiation of the day are about 2 degrees Celsius and 30 W/m2, respectively. Keywords: energy conservation, design weather database, HVAC, copula approach
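The core idea, that the joint behavior of temperature and solar radiation differs from what independent marginals imply, can be illustrated without a full copula fit. The sketch below uses synthetic correlated data as a stand-in for the 10-year Osaka records and compares the empirical simultaneous exceedance probability with the independence product; fitting an actual Joe-Frank copula would require a specialized library such as statsmodels' copula module. All numbers are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 87600  # roughly 10 years of hourly observations (illustrative)

# Correlated air temperature (deg C) and global solar radiation (W/m2),
# generated from a bivariate normal as a crude stand-in for real data.
z = rng.multivariate_normal([0, 0], [[1, 0.6], [0.6, 1]], size=n)
temperature = 25 + 5 * z[:, 0]
radiation = np.clip(500 + 200 * z[:, 1], 0, None)

t_design, g_design = 33.0, 800.0  # candidate design thresholds

# Simultaneous exceedance probability vs. the independence assumption
joint = np.mean((temperature > t_design) & (radiation > g_design))
independent = np.mean(temperature > t_design) * np.mean(radiation > g_design)
print(joint, independent)
```

With positive dependence the joint exceedance probability differs markedly from the independence product, which is why modeling the dependence (here via a copula) changes the credible design conditions.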
Procedia PDF Downloads 222
2739 Flexural Fatigue Performance of Self-Compacting Fibre Reinforced Concrete
Authors: Surinder Pal Singh, Sanjay Goel
Abstract:
The paper presents the results of an investigation conducted to study the flexural fatigue characteristics of Self-Compacting Concrete (SCC) and Self-Compacting Fibre Reinforced Concrete (SCFRC). In total, 360 flexural fatigue tests and 270 static flexural strength tests were conducted on SCC and SCFRC specimens to obtain the fatigue test data. The variability in the distribution of fatigue life of SCC and SCFRC has been analyzed and compared with that of normally vibrated concrete (NVC) and NVFRC containing steel fibres of comparable size and shape. The experimental coefficients of the fatigue equations have been estimated to represent the relationship between stress level (S) and fatigue life (N) for SCC and SCFRC containing different fibre volume fractions. The probability of failure (Pf) has been incorporated into the S-N relationships to obtain families of S-N-Pf relationships. A good agreement between the predicted curves and those obtained from the test data has been observed. The fatigue performance of SCC and SCFRC has been evaluated in terms of the two-million-cycle fatigue strength/endurance limit. The theoretical fatigue lives were also estimated using the single-log fatigue equation at a 10% probability of failure, to quantify the extent to which the theoretical fatigue lives of SCFRC are enhanced relative to SCC and NVC. The reduced variability in fatigue life, increased endurance limit, and increased theoretical fatigue lives demonstrate an overall better fatigue performance for SCC and SCFRC. Keywords: fatigue life, fibre, probability of failure, self-compacting concrete
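The single-log fatigue equation referred to above has the form S = a - b·log10(N); fitting its coefficients and reading off the two-million-cycle endurance limit can be sketched as below. The stress levels and fatigue lives are illustrative placeholders, not the paper's data, and families of S-N-Pf curves would repeat this fit for each probability of failure.

```python
import numpy as np

# Illustrative (synthetic) fatigue data: stress level S vs. cycles N
S = np.array([0.90, 0.85, 0.80, 0.75, 0.70])        # stress level
N = np.array([3.2e3, 2.1e4, 1.5e5, 9.0e5, 4.8e6])    # cycles to failure

# Least-squares fit of S = a + b*log10(N); the slope b comes out negative
b, a = np.polyfit(np.log10(N), S, 1)

# Two-million-cycle endurance limit implied by the fitted line
endurance_limit = a + b * np.log10(2e6)

# Theoretical fatigue life at a chosen stress level, e.g. S = 0.8
life_at_S = 10 ** ((0.8 - a) / b)
print(endurance_limit, life_at_S)
```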
Procedia PDF Downloads 329
2738 Wind Fragility of Window Glass in 10-Story Apartment with Two Different Window Models
Authors: Viriyavudh Sim, WooYoung Jung
Abstract:
Damage due to high wind is not limited to load-resisting components such as beams and columns. The majority of damage is due to breaches in the building envelope, such as broken roofs, windows, and doors. In this paper, the wind fragility of window glass in a residential apartment was determined to compare the difference between two window configuration models. The Monte Carlo simulation method was used to derive damage data, and analytical fragilities were constructed. The fragility of the window system showed that windows located in the leeward wall had a higher probability of failure, especially those close to the edge of the structure. Between the two window models, Model 2 had a higher probability of failure, due to the number of panels in this configuration. Keywords: wind fragility, glass window, high rise building, wind disaster
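A Monte Carlo fragility analysis of this kind can be sketched with a simple capacity-versus-demand formulation: a pane fails when the wind-induced pressure exceeds its random glass resistance, and the fragility is the failure fraction at each wind speed. All parameter values below (resistance distribution, pressure coefficient) are illustrative assumptions, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(1)
n_samples = 100_000
rho_air = 1.225   # air density, kg/m3
cp = 1.2          # assumed net pressure coefficient (placeholder)

# Random glass resistance: lognormal with median 2.5 kPa (illustrative)
capacity = rng.lognormal(mean=np.log(2500.0), sigma=0.25, size=n_samples)

wind_speeds = np.arange(20, 71, 5)   # m/s
fragility = []
for v in wind_speeds:
    demand = 0.5 * rho_air * cp * v**2        # quasi-static pressure, Pa
    fragility.append(np.mean(demand > capacity))

fragility = np.array(fragility)
print(dict(zip(wind_speeds.tolist(), fragility.round(3).tolist())))
```

A lognormal CDF is then typically fitted to these empirical points to obtain the analytical fragility curve.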
Procedia PDF Downloads 235
2737 Robust Noisy Speech Identification Using Frame Classifier Derived Features
Authors: Punnoose A. K.
Abstract:
This paper presents an approach for identifying noisy speech recordings using a multi-layer perceptron (MLP) trained to predict phonemes from acoustic features. Characteristics of the MLP posteriors are explored for clean speech and noisy speech at the frame level. Appropriate density functions are fitted to the softmax probabilities of clean and noisy speech. A function that takes into account the ratio of the softmax probability density of noisy speech to that of clean speech is formulated. This phoneme-independent scoring is weighted using phoneme-specific weights to make the scoring more robust. Simple thresholding is used to distinguish noisy speech recordings from clean ones. The approach is benchmarked on standard databases, with a focus on precision. Keywords: noisy speech identification, speech pre-processing, noise robustness, feature engineering
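The density-ratio scoring idea can be sketched as follows: fit a density to the per-frame maximum softmax probability for clean and for noisy speech, then score a recording by the average log-ratio of the noisy density to the clean density. The use of beta distributions, the synthetic frame values, and the omission of the phoneme-specific weighting are all illustrative simplifications, not the paper's exact choices.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Synthetic per-frame max-softmax values: clean frames are classified
# confidently (values near 1), noisy frames less so.
clean_frames = rng.beta(8, 2, size=5000)
noisy_frames = rng.beta(3, 3, size=5000)

# Fit beta densities on training frames, with support fixed to [0, 1]
a_c, b_c, _, _ = stats.beta.fit(clean_frames, floc=0, fscale=1)
a_n, b_n, _, _ = stats.beta.fit(noisy_frames, floc=0, fscale=1)

def noisiness_score(frames):
    """Mean log density-ratio; positive values suggest a noisy recording."""
    log_ratio = (stats.beta.logpdf(frames, a_n, b_n)
                 - stats.beta.logpdf(frames, a_c, b_c))
    return float(np.mean(log_ratio))

test_clean = rng.beta(8, 2, size=1000)
test_noisy = rng.beta(3, 3, size=1000)
print(noisiness_score(test_clean), noisiness_score(test_noisy))
```

A threshold on this score (here, zero) then separates noisy recordings from clean ones.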
Procedia PDF Downloads 97
2736 Optimal Scheduling for Energy Storage System Considering Reliability Constraints
Authors: Wook-Won Kim, Je-Seok Shin, Jin-O Kim
Abstract:
This paper proposes a method for the optimal scheduling of a battery energy storage system subject to a reliability constraint on the energy storage system. The optimal scheduling problem is solved by dynamic programming with a proposed transition matrix. The proposed optimal scheduling method guarantees the minimum fuel cost within the specified reliability constraint. To evaluate the proposed method, the timely capacity outage probability table (COPT) is used, which is calculated by convolving the probability mass functions of the individual generators. The study presents the resulting optimal schedule of the energy storage system. Keywords: energy storage system (ESS), optimal scheduling, dynamic programming, reliability constraints
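The COPT construction mentioned above, convolving per-generator outage probability mass functions, can be sketched as below. Each generator is treated as a two-state unit (fully available, or fully out with its forced outage rate); the capacities and rates are illustrative placeholders.

```python
import numpy as np

generators = [
    (100, 0.05),  # (capacity in MW, forced outage rate)
    (150, 0.04),
    (200, 0.08),
]

# pmf[k] = probability that exactly k MW of capacity is on outage,
# indexed in 1 MW steps for simplicity.
total_capacity = sum(c for c, _ in generators)
pmf = np.zeros(total_capacity + 1)
pmf[0] = 1.0
for capacity, forced_outage_rate in generators:
    unit = np.zeros(capacity + 1)
    unit[0] = 1.0 - forced_outage_rate   # unit available
    unit[capacity] = forced_outage_rate  # unit on outage
    pmf = np.convolve(pmf, unit)[: total_capacity + 1]

# Cumulative COPT: probability that outage capacity is at least x MW
copt = pmf[::-1].cumsum()[::-1]
print(copt[0], copt[200], copt[total_capacity])
```

A time-varying COPT simply repeats this with the units committed in each period, and the reliability constraint in the dynamic program is then checked against the resulting table.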
Procedia PDF Downloads 376
2735 Predicting the Uniaxial Strength Distribution of Brittle Materials Based on a Uniaxial Test
Authors: Benjamin Sonnenreich
Abstract:
Brittle fracture failure probability is best described using a stochastic approach based on the 'weakest link concept' and the connection between the microstructure and the macroscopic fracture scale. A general theoretical and experimental framework is presented to predict the uniaxial strength distribution from independent uniaxial test data. The framework takes as input the applied stresses, the geometry, the materials, the defect distributions, and the relevant random variables from uniaxial test results, and gives as output an overall failure probability that can be used to improve the reliability of practical designs. Additionally, the method facilitates comparisons of strength data from several sources, uniaxial tests, and sample geometries. Keywords: brittle fracture, strength distribution, uniaxial, weakest link concept
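Weakest-link statistics are commonly parameterized with a two-parameter Weibull distribution, Pf = 1 - exp(-(σ/σ0)^m). The sketch below estimates the Weibull modulus m and scale σ0 from (synthetic) uniaxial strength data via the linearized Weibull plot; the specific parameter values and the rank estimator are illustrative assumptions, not the paper's framework.

```python
import numpy as np

rng = np.random.default_rng(3)
m_true, sigma0_true = 10.0, 300.0            # MPa, illustrative
strengths = sigma0_true * rng.weibull(m_true, size=200)

# Weibull plot: ln(-ln(1 - Pf)) = m*ln(sigma) - m*ln(sigma0)
s = np.sort(strengths)
n = len(s)
pf_rank = (np.arange(1, n + 1) - 0.5) / n     # simple rank estimator
y = np.log(-np.log(1.0 - pf_rank))
m_hat, c_hat = np.polyfit(np.log(s), y, 1)
sigma0_hat = np.exp(-c_hat / m_hat)

def failure_probability(sigma):
    """Predicted Pf at applied uniaxial stress sigma (same test volume)."""
    return 1.0 - np.exp(-(sigma / sigma0_hat) ** m_hat)

print(m_hat, sigma0_hat, failure_probability(250.0))
```

Extending the prediction to other geometries or stress states adds a volume (or area) scaling term to the exponent, which is where the weakest-link size effect enters.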
Procedia PDF Downloads 297
2734 Examining Motivational Strategies of Foreign Manufacturing Firms in Ghana
Authors: Samuel Ato Dadzie
Abstract:
The objective of this study is to examine the influence of the eclectic paradigm on the motivational strategy of foreign subsidiaries in Ghana. The study uses a binary regression model, and the analysis is based on 75 manufacturing investments made by MNEs from different countries in 1994–2008. The results indicate that perceived market size increases the probability of foreign firms undertaking market-seeking (MS) foreign direct investment (FDI) in Ghana, while perceived cultural distance between Ghana and a foreign firm's home country decreases that probability. Furthermore, extensive international experience also decreases the probability of foreign firms undertaking market-seeking FDI in Ghana. Most earlier studies were based on advanced and emerging countries and offered support for the theory, generalizing that multinational corporations (MNCs) rely on it for investment strategy outside their home countries. Applied in the context of Ghana, however, the results do not offer strong support for the theory. This means that MNCs coming to Sub-Saharan Africa cannot rely much on the eclectic paradigm for their motivational strategies, because the prevailing economic conditions in Ghana differ from those of the advanced and emerging economies where the institutional structures work. Keywords: foreign subsidiary, motives, Ghana, foreign direct investment
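The binary regression setup behind these findings can be sketched as a logit model: the dependent variable is 1 for a market-seeking investment and 0 otherwise, regressed on market size, cultural distance, and international experience. The data below are synthetic placeholders shaped to mimic the reported coefficient signs, not the study's 75 observations, and the model is fitted with a plain Newton-Raphson (IRLS) loop rather than a statistics package.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 500
X = np.column_stack([
    np.ones(n),                # intercept
    rng.normal(0, 1, n),       # perceived market size
    rng.normal(0, 1, n),       # perceived cultural distance
    rng.normal(0, 1, n),       # international experience
])
# Signs per the abstract: + market size, - distance, - experience
true_beta = np.array([0.0, 1.0, -0.8, -0.6])
p = 1 / (1 + np.exp(-X @ true_beta))
y = (rng.uniform(size=n) < p).astype(float)

# Newton-Raphson / iteratively reweighted least squares for the logit MLE
beta = np.zeros(4)
for _ in range(25):
    mu = 1 / (1 + np.exp(-X @ beta))
    grad = X.T @ (y - mu)
    hess = X.T @ (X * (mu * (1 - mu))[:, None])
    beta = beta + np.linalg.solve(hess, grad)

print(beta)  # intercept, market size, cultural distance, experience
```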
Procedia PDF Downloads 405
2733 Using the Bootstrap for Problems in Statistics
Authors: Brahim Boukabcha, Amar Rebbouh
Abstract:
The bootstrap method, based on the idea of exploiting all the information provided by the initial sample, allows us to study the properties of estimators. In this article, we present a theoretical study of the different bootstrap methods and use the resampling technique in statistical inference to calculate the standard error of an estimator and to determine a confidence interval for an estimated parameter. We apply and test these methods on regression models and the Pareto model, obtaining good approximations. Keywords: bootstrap, standard error, bias, jackknife, mean, median, variance, confidence interval, regression models
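The basic nonparametric scheme the abstract refers to can be sketched in a few lines: resample the data with replacement, recompute the statistic, and use the spread of the replicates for the standard error and a percentile confidence interval. The sample below is synthetic, and the mean is used as the statistic for simplicity.

```python
import numpy as np

rng = np.random.default_rng(5)
sample = rng.exponential(scale=2.0, size=100)   # skewed data, true mean 2

n_boot = 10_000
boot_means = np.array([
    rng.choice(sample, size=sample.size, replace=True).mean()
    for _ in range(n_boot)
])

# Bootstrap standard error and 95% percentile confidence interval
standard_error = boot_means.std(ddof=1)
ci_low, ci_high = np.percentile(boot_means, [2.5, 97.5])

# For the mean, the bootstrap SE should track the classical s/sqrt(n)
classical_se = sample.std(ddof=1) / np.sqrt(sample.size)
print(standard_error, classical_se, (ci_low, ci_high))
```

The same recipe works for statistics with no closed-form standard error (medians, regression coefficients, Pareto tail indices), which is where the bootstrap earns its keep.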
Procedia PDF Downloads 355
2732 Dimensioning of Circuit Switched Networks by Using Simulation Code Based On Erlang (B) Formula
Authors: Ali Mustafa Elshawesh, Mohamed Abdulali
Abstract:
The paper presents an approach to dimensioning circuit-switched networks and finding the relationship between the parameters of such networks under a specified probability of call blocking. Our work creates simulation code based on the Erlang B formula to draw graphs, each showing two curves, one simulated and one calculated. These curves represent the relationships of the average number of calls and the average call duration with the probability of call blocking. This simulation code facilitates the selection of appropriate parameters for circuit-switched networks. Keywords: Erlang B formula, call blocking, telephone system dimension, Markov model, link capacity
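The calculated curve comes from the Erlang B formula, which is usually evaluated with its numerically stable recursion B(0) = 1, B(n) = A·B(n-1) / (n + A·B(n-1)), where A is the offered traffic in erlangs and n the number of circuits. A minimal sketch, with illustrative traffic figures:

```python
def erlang_b(traffic_erlangs: float, circuits: int) -> float:
    """Blocking probability for offered traffic A over n circuits."""
    b = 1.0
    for n in range(1, circuits + 1):
        b = traffic_erlangs * b / (n + traffic_erlangs * b)
    return b

# Offered traffic A = (average call arrival rate) * (average call duration)
calls_per_hour = 180
mean_duration_hours = 3 / 60                     # 3-minute average call
offered = calls_per_hour * mean_duration_hours   # 9 erlangs

print(erlang_b(offered, 15))   # blocking probability with 15 circuits
```

Sweeping the arrival rate or the call duration through this function reproduces the calculated curves described above, against which the simulated results are compared.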
Procedia PDF Downloads 569