Search results for: probability and statistics.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 952

742 Probabilistic Modeling of Network-induced Delays in Networked Control Systems

Authors: Manoj Kumar, A.K. Verma, A. Srividya

Abstract:

Time-varying network-induced delays in networked control systems (NCS) are known to degrade the control system's quality of performance (QoP) and to cause stability problems. In the literature, control methods that model communication delays as probability distributions have proven to perform better. This paper therefore focuses on modeling network-induced delays as probability distributions. CAN and MIL-STD-1553B are extensively used to carry periodic control and monitoring data in networked control systems, yet the literature provides methods to estimate only the worst-case delays for these networks. In this paper, probabilistic network delay models for CAN and MIL-STD-1553B networks are given, together with a systematic method to estimate the model parameter values from network parameters. A method to predict the network delay in the next cycle from the present network delay is presented. The effect of active network redundancy and of redundancy at the node level on network delay and system response time is also analyzed.
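
As an illustration of the modeling idea only (the abstract does not state which distribution family is used, so the shifted gamma, the sample values, and the deadline below are assumptions), the following Python sketch fits a distribution to measured delay samples and estimates the probability that a delay exceeds a control-loop deadline:

# Illustrative sketch only: fit a probability distribution to measured
# network-induced delays and estimate the chance of missing a deadline.
# The paper's actual model for CAN / MIL-STD-1553B may differ.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
delay_samples = rng.gamma(shape=3.0, scale=0.4e-3, size=5000) + 1e-3  # hypothetical measurements [s]

# Fit a gamma distribution with a location (shift) parameter.
shape, loc, scale = stats.gamma.fit(delay_samples)

deadline_s = 4e-3  # hypothetical control-loop deadline
p_miss = stats.gamma.sf(deadline_s, shape, loc=loc, scale=scale)
print(f"P(delay > {deadline_s*1e3:.1f} ms) ~ {p_miss:.4f}")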

Keywords: NCS (networked control system), delay analysis, response-time distribution, worst-case delay, CAN, MIL-STD-1553B, redundancy

741 A Markov Chain Model for Load-Balancing Based and Service Based RAT Selection Algorithms in Heterogeneous Networks

Authors: Abdallah Al Sabbagh

Abstract:

The Next Generation Wireless Network (NGWN) is expected to be a heterogeneous network that integrates all the different Radio Access Technologies (RATs) through a common platform. A major challenge is how to allocate users to the most suitable RAT. An optimized solution can maximize the efficient use of radio resources, achieve better performance for service providers, and provide Quality of Service (QoS) to users at low cost. Currently, Radio Resource Management (RRM) is implemented efficiently only for the RAT for which it was developed and is not suitable for a heterogeneous network. Common RRM (CRRM) was therefore proposed to manage radio resource utilization in the heterogeneous network. This paper presents a user-level Markov model for a network of three co-located RATs. The load-balancing based and service based CRRM algorithms are studied using the presented Markov model, and their performance is compared in terms of traffic distribution, new call blocking probability, vertical handover (VHO) call dropping probability, and throughput.
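
A minimal sketch of this kind of Markovian evaluation, assuming exponential arrivals and service times and hypothetical RAT capacities (the paper's actual parameters are not given in the abstract): it simulates a load-balancing based selection among three co-located RATs and estimates the new call blocking probability.

# Illustrative sketch only (not the paper's exact model): estimate the
# new-call blocking probability of a load-balancing based RAT selection
# policy over three co-located RATs with a simple Markovian simulation.
import heapq, random

random.seed(1)
capacity = [8, 12, 16]      # channels per RAT (hypothetical)
lam, mu = 20.0, 1.0         # call arrival rate and per-call service rate
busy = [0, 0, 0]
blocked = arrivals = 0
t, horizon = 0.0, 10_000.0
events = [(random.expovariate(lam), "arrival", None)]

while events:
    t, kind, rat = heapq.heappop(events)
    if t > horizon:
        break
    if kind == "arrival":
        arrivals += 1
        # Load balancing: pick the RAT with the most free channels.
        free = [c - b for c, b in zip(capacity, busy)]
        best = max(range(3), key=lambda i: free[i])
        if free[best] > 0:
            busy[best] += 1
            heapq.heappush(events, (t + random.expovariate(mu), "departure", best))
        else:
            blocked += 1
        heapq.heappush(events, (t + random.expovariate(lam), "arrival", None))
    else:  # departure
        busy[rat] -= 1

print("estimated new-call blocking probability:", blocked / arrivals)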

Keywords: Heterogeneous Wireless Network, Markov chain model, load-balancing based and service based algorithm, CRRM algorithms, Beyond 3G network.

740 Forecasting 24-Hour Ahead Electricity Load Using Time Series Models

Authors: Ramin Vafadary, Maryam Khanbaghi

Abstract:

Forecasting electricity load is important for purposes such as planning, operation, and control. Accurate forecasts can save operating and maintenance costs, increase the reliability of power supply and delivery systems, and support correct decisions for future development. This paper compares various time series methods for forecasting electricity load 24 hours ahead. The methods considered are Holt-Winters smoothing, SARIMA modeling, LSTM networks, Fbprophet, and TensorFlow Probability. The performance of each method is evaluated using the forecasting accuracy criteria of Mean Absolute Error and Root Mean Square Error. National Renewable Energy Laboratory (NREL) residential energy consumption data are used to train the models. The results of this study show that the SARIMA model is superior to the others for 24-hour-ahead forecasts. Furthermore, a bagging technique is used to make the predictions more robust: the results show that by bagging multiple time-series forecasts we can improve the robustness of the models for 24-hour-ahead electricity load forecasting.
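
A minimal sketch of the SARIMA step, using statsmodels on a synthetic hourly series standing in for the NREL data; the (p,d,q)(P,D,Q,s) orders below are placeholders, not the paper's tuned values:

# Illustrative sketch only: a 24-hour-ahead SARIMA forecast on hourly data,
# evaluated with MAE and RMSE as in the abstract.
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(0)
hours = pd.date_range("2021-01-01", periods=24 * 60, freq="H")
load = 50 + 10 * np.sin(2 * np.pi * hours.hour / 24) + rng.normal(0, 1, len(hours))
series = pd.Series(load, index=hours)  # synthetic stand-in for NREL residential data

train, test = series[:-24], series[-24:]
model = SARIMAX(train, order=(1, 0, 1), seasonal_order=(1, 1, 1, 24)).fit(disp=False)
forecast = model.forecast(steps=24)

mae = np.mean(np.abs(forecast.values - test.values))
rmse = np.sqrt(np.mean((forecast.values - test.values) ** 2))
print(f"MAE={mae:.2f}  RMSE={rmse:.2f}")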

Keywords: Bagging, Fbprophet, Holt-Winters, LSTM, Load Forecast, SARIMA, TensorFlow Probability, time series.

739 Gender Dimension of Migrations Influenced by Genocide and Feminicides around the Globe

Authors: Lejla Mušić

Abstract:

The gender dimension of migration analyzes the intersection between world statistics on male and female migration, including the question of youth migration. Comparative analysis of world migration statistics, used here as the methodology, offers insight into the position of women in labor markets around the world. There are different forms of youth debris in the contemporary world. The main problems are illegal migration, the feminization of poverty, the kidnapping of girls in Nigeria, and the femicides in Juárez, Mexico. Illegal migration involves forced labor, rape, and prostitution. Transgender youth share ideas through online media (anti-bullying videos) and develop their own styles such as anarcho-punk, rave, or rock. Therefore, stronger gender equality laws and laws for the protection of women at work should be enforced.

Keywords: Hyper feminization, rape, gangs of girls, rent boys masculinities, Varoç in Istanbul.

738 Exploring the Activity Fabric of an Intelligent Environment with Hierarchical Hidden Markov Theory

Authors: Chiung-Hui Chen

Abstract:

The Internet of Things (IoT) was designed for widespread convenience. With smart tags and the sensing network, a large quantity of dynamic information is immediately available in the IoT. Through internal communication and interaction, meaningful objects provide real-time services to users. Therefore, providing services with appropriate decision-making has become an essential issue. Based on the science of human behavior, this study employed an environment model to record the time sequences and locations of different behaviors, and adopted the probability module of the hierarchical Hidden Markov Model for the inference. The statistical analysis was conducted to achieve the following objectives: first, define user behaviors and predict user behavior routes with the environment model in order to analyze user purposes; second, construct the hierarchical Hidden Markov Model according to the logic framework and establish the sequential intensity among behaviors to characterize the use and activity fabric of the intelligent environment; third, establish the intensity of the relation between objects and the probability of their being used, an indicator that can describe the possible limitations of the mechanism. As the process is recorded in the information system created in this study, these data can be reused to adjust the procedure of intelligent design services.
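
For illustration only, the sketch below scores an observed object-usage sequence with the forward algorithm of a flat (non-hierarchical) two-state HMM; the states, objects, and probabilities are hypothetical, and the paper's hierarchical model adds further structure on top of this basic computation:

# Illustrative sketch only: forward algorithm of a simple HMM over user
# activities, showing how a behavior sequence can be scored probabilistically.
import numpy as np

states = ["resting", "cooking"]            # hidden activities (hypothetical)
obs_symbols = ["sofa", "fridge", "stove"]  # sensed object usages (hypothetical)

pi = np.array([0.6, 0.4])                  # initial state probabilities
A = np.array([[0.8, 0.2],                  # state transition probabilities
              [0.3, 0.7]])
B = np.array([[0.7, 0.2, 0.1],             # emission probabilities P(object | activity)
              [0.1, 0.4, 0.5]])

observed = [1, 2, 2, 0]                    # fridge, stove, stove, sofa

alpha = pi * B[:, observed[0]]
for o in observed[1:]:
    alpha = (alpha @ A) * B[:, o]
print("P(observation sequence) =", alpha.sum())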

Keywords: Behavior, big data, hierarchical Hidden Markov Model, intelligent object.

737 Extreme Rainfall Frequency Analysis for Meteorological Sub-Division 4 of India Using L-Moments

Authors: Th. Arti Devi, Parthasarthi Choudhury

Abstract:

Extreme rainfall frequency analysis for Meteorological Sub-Division 4 of India was carried out using the L-moments approach. Serial correlation and Mann-Kendall tests were conducted to check the serial independence and stationarity of the observations. A discordancy measure was computed for the sites to detect discordant sites. Regional homogeneity was tested by comparison with 500 homogeneous regions generated using a four-parameter Kappa distribution. The best-fit distribution was selected based on the ZDIST statistic and the L-moment ratio diagram from the five extreme-value distributions GPD, GLO, GEV, P3, and LP3. The LN3 distribution was selected, and the regional rainfall frequency relationship was established using the index-rainfall procedure. A regional mean rainfall relationship was developed using multiple linear regression with the latitude and longitude of the sites as variables.
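
A minimal sketch of the underlying computation: sample L-moments and L-moment ratios of an annual-maximum series, estimated from probability-weighted moments (the rainfall values below are synthetic placeholders):

# Illustrative sketch only: sample L-moments of a rainfall series, the
# quantities on which discordancy, homogeneity and ZDIST tests are built.
import numpy as np

def sample_l_moments(x):
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    i = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((i - 1) / (n - 1) * x) / n
    b2 = np.sum((i - 1) * (i - 2) / ((n - 1) * (n - 2)) * x) / n
    b3 = np.sum((i - 1) * (i - 2) * (i - 3) / ((n - 1) * (n - 2) * (n - 3)) * x) / n
    l1 = b0
    l2 = 2 * b1 - b0
    l3 = 6 * b2 - 6 * b1 + b0
    l4 = 20 * b3 - 30 * b2 + 12 * b1 - b0
    return l1, l2, l3 / l2, l4 / l2   # mean, L-scale, L-skewness, L-kurtosis

rainfall = np.random.default_rng(0).gumbel(loc=120, scale=30, size=40)  # synthetic annual maxima
print(sample_l_moments(rainfall))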

Keywords: L-moments, ZDIST statistics, serial correlation, Mann-Kendall test.

736 Effect of Atmospheric Turbulence on Hybrid FSO/RF Link Availability under Qatar Harsh Climate

Authors: Abir Touati, Syed Jawad Hussain, Farid Touati, Ammar Bouallegue

Abstract:

Although there has been growing interest in hybrid free-space optical/radio frequency (FSO/RF) communication systems, the current literature is limited to results obtained in moderate or cold environments. In this paper, using a soft-switching approach, we investigate the effect of weather inhomogeneities on the strength of turbulence, and hence on the channel refractive index, under Qatar's harsh environment, and their influence on hybrid FSO/RF availability. In this approach, either the FSO link, the RF link, both simultaneously, or neither may be active. Based on the soft-switching approach and a finite-state Markov chain (FSMC) process, we model the channel fading for the two links and derive a mathematical expression for the outage probability of the hybrid system. We then evaluate the behavior of the hybrid FSO/RF system under hazy and harsh weather. Results show that FSO/RF soft switching renders the system outage probability lower than that of each link individually. A soft-switching algorithm is being implemented on FPGAs using Raptor codes, interfaced to the two terminals of a 1 Gbps/100 Mbps FSO/RF hybrid system, the first implemented in the region. Experimental results are compared to the above simulation results.
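
To illustrate why soft switching helps, the Monte Carlo sketch below assumes log-normal fading for the FSO link and Rayleigh-type fading for the RF link (the paper instead uses an FSMC fading model) and compares the outage probability of each link with that of the hybrid system, which is in outage only when both links fail:

# Illustrative sketch only: Monte Carlo outage comparison for a hybrid
# FSO/RF link under soft switching. Fading models and SNR values are
# hypothetical placeholders.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
snr_fso = 10 * rng.lognormal(mean=0.0, sigma=0.8, size=n)   # hypothetical FSO SNR samples
snr_rf = 8 * rng.exponential(scale=1.0, size=n)             # hypothetical RF SNR samples
threshold = 3.0                                              # outage SNR threshold

out_fso = np.mean(snr_fso < threshold)
out_rf = np.mean(snr_rf < threshold)
out_hybrid = np.mean((snr_fso < threshold) & (snr_rf < threshold))  # outage only if both links fail
print(f"FSO-only {out_fso:.4f}  RF-only {out_rf:.4f}  hybrid {out_hybrid:.4f}")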

Keywords: Atmospheric turbulence, haze, soft switching, Raptor codes, refractive index.

735 Blind Impulse Response Identification of Frequency Radio Channels: Application to Bran A Channel

Authors: S. Safi, M. Frikel, M. M'Saad, A. Zeroual

Abstract:

This paper describes a blind algorithm for estimating a time-varying and frequency-selective fading channel. In order to identify the impulse response of these channels blindly, we use Higher Order Statistics (HOS) to build our algorithm. We have selected two theoretical frequency-selective channels, Proakis's 'B' channel and Macchi's channel, and one practical frequency-selective fading channel, the Broadband Radio Access Network channel (BRAN A). The simulation results, in a noisy environment and for different channel input data, demonstrate that the proposed method can estimate the phase and magnitude of these channels blindly and without any information about the input, except that the input excitation is i.i.d. (independent and identically distributed) and non-Gaussian.

Keywords: Frequency response, system identification, higher order statistics, communication channels, phase estimation.

734 Energy Detection Based Sensing and Primary User Traffic Classification for Cognitive Radio

Authors: Urvee B. Trivedi, U. D. Dalal

Abstract:

As wireless communication services grow quickly, the seriousness of the spectrum utilization problem has gradually risen. Cognitive radio, an emerging technology, has appeared to solve today's spectrum scarcity problem. To support the spectrum reuse functionality, secondary users are required to sense the radio frequency environment and, once the primary users are found to be active, to vacate the channel within a certain amount of time. Therefore, spectrum sensing is of significant importance. Once sensing is done, different prediction rules apply to classify the traffic pattern of the primary user. Primary users follow two types of traffic patterns: periodic and stochastic ON-OFF patterns. A cognitive radio can learn the patterns in different channels over time. Two types of classification methods are discussed in this paper: one based on edge detection and one using the autocorrelation function. The edge detection method has high accuracy but cannot tolerate sensing errors. Autocorrelation-based classification is applicable in real environments, as it can tolerate some amount of sensing errors.
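
A minimal sketch of the energy-detection step, assuming an AWGN channel, BPSK-like primary-user symbols, and a hypothetical threshold and SNR; it estimates the probabilities of detection (PD) and false alarm (PF) by Monte Carlo:

# Illustrative sketch only: energy detection of a primary user and Monte
# Carlo estimation of PD and PF. Threshold and SNR are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_trials = 256, 20_000
snr_linear = 10 ** (-5 / 10)          # -5 dB received SNR (hypothetical)
threshold = n_samples * 1.15          # detection threshold on the energy statistic

def energy(x):
    return np.sum(np.abs(x) ** 2)

detections = false_alarms = 0
for _ in range(n_trials):
    noise = rng.normal(0, 1, n_samples)
    signal = np.sqrt(snr_linear) * rng.choice([-1.0, 1.0], n_samples)  # PU transmits BPSK-like symbols
    detections += energy(signal + noise) > threshold
    false_alarms += energy(noise) > threshold

print("PD ~", detections / n_trials, " PF ~", false_alarms / n_trials)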

Keywords: Cognitive radio (CR), probability of detection (PD), probability of false alarm (PF), primary user (PU), secondary user (SU), Fast Fourier transform (FFT), signal-to-noise ratio (SNR).

733 Analysis of the Evolution of Social and Economic Indicators of Mercosur's Members: 1980-2012

Authors: L. Aparecida Bastos, J. Leige Lopes, J. Crepaldi, R. Monteiro da Silva

Abstract:

The objective of this study is to analyze the evolution of selected social and economic indicators of Mercosur's economies from 1980 to 2012, based on statistics from the Latin American Integration Association (LAIA). The aim is to observe whether these indicators showed better performance after the accession of these economies to Mercosur (the first accessions occurred in 1994), in order to determine whether economic integration contributed to improved trade, macroeconomic performance, and the level of social and economic development of the member countries. To this end, the methodologies used are a literature review and descriptive statistics. The theoretical framework guiding the work consists of the theories of integration: classical liberal, Marxist, and structural-proactive. The results reveal that most social and economic indicators showed better performance in those economies that joined Mercosur after 1994. This work is the result of an investigation already completed.

Keywords: Economic integration, Mercosur, social indicators, economic indicators.

732 Mining Network Data for Intrusion Detection through Naïve Bayesian with Clustering

Authors: Dewan Md. Farid, Nouria Harbi, Suman Ahmmed, Md. Zahidur Rahman, Chowdhury Mofizur Rahman

Abstract:

Network security attacks are violations of the information security policy that have received much attention from the computational intelligence community in the last decades. Data mining has become a very useful technique for detecting network intrusions by extracting useful knowledge from large numbers of network records or logs. The naïve Bayesian classifier is one of the most popular data mining algorithms for classification, providing an optimal way to predict the class of an unknown example. It has been shown, however, that a single set of probabilities derived from the data is not good enough to achieve a high classification rate. In this paper, we propose a new learning algorithm for mining network logs to detect network intrusions with a naïve Bayesian classifier: the algorithm first clusters the network logs into several groups based on the similarity of the logs and then calculates the prior and conditional probabilities for each group. To classify a new log, the algorithm checks which cluster the log belongs to and then uses that cluster's probability set to classify it. We tested the performance of the proposed algorithm on the KDD99 benchmark network intrusion detection dataset, and the experimental results show that it improves detection rates and reduces false positives for different types of network intrusions.
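
A minimal sketch of the clustering-then-naïve-Bayes idea (not the authors' exact algorithm), using synthetic data in place of KDD99 features: records are grouped with k-means, one Gaussian naïve Bayesian model is trained per cluster, and a new record is classified with the probability set of its nearest cluster.

# Illustrative sketch only: per-cluster naïve Bayesian classification.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_classification
from sklearn.naive_bayes import GaussianNB

X, y = make_classification(n_samples=2000, n_features=10, n_informative=6, random_state=0)

kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)
models = {}
for c in range(4):
    mask = kmeans.labels_ == c
    models[c] = GaussianNB().fit(X[mask], y[mask])  # per-cluster prior/conditional probabilities

def classify(record):
    cluster = int(kmeans.predict(record.reshape(1, -1))[0])  # find the record's cluster
    return models[cluster].predict(record.reshape(1, -1))[0]  # use that cluster's probability set

print("predicted class of first record:", classify(X[0]), "true class:", y[0])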

Keywords: Clustering, detection rate, false positive, naïve Bayesian classifier, network intrusion detection.

731 Influence of Local Soil Conditions on Optimal Load Factors for Seismic Design of Buildings

Authors: Miguel A. Orellana, Sonia E. Ruiz, Juan Bojórquez

Abstract:

Optimal load factors (dead, live, and seismic) used for the design of buildings may differ depending on the characteristics of the seismic ground motions to which the buildings are subjected, which are closely related to the type of soil at the site. The influence of the type of soil on those load factors is analyzed in the present study. A methodology useful for establishing optimal load factors that minimize the cost over the life cycle of the structure is employed; as a restriction, it is established that the probability of structural failure must be less than or equal to a prescribed value. The life-cycle cost model used here includes different types of costs. The optimization methodology is applied to two groups of reinforced concrete buildings. One set (consisting of 4-, 7-, and 10-story buildings) is located on firm ground (with a dominant period Ts = 0.5 s) and the other (consisting of 6-, 12-, and 16-story buildings) on the soft soil (Ts = 1.5 s) of Mexico City. Each group of buildings is designed using different combinations of load factors. The statistics of the maximum inter-story drifts (associated with the structural capacity) are found by means of incremental dynamic analyses. The buildings located in the firm zone are analyzed under the action of 10 strong seismic records, and those in the soft zone under 13 strong ground motions. All the motions correspond to seismic subduction events with magnitudes M = 6.9. Then, the structural damage and the expected total costs corresponding to each group of buildings are estimated. It is concluded that the optimal load factor combination for the design of buildings located on firm ground is different from that for buildings located on soft soil.
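
The sketch below illustrates only the optimization principle, minimizing expected life-cycle cost subject to an admissible failure probability; the cost and fragility curves are hypothetical placeholders, not the incremental-dynamic-analysis results of the paper:

# Illustrative sketch only: pick the seismic load factor that minimizes
# expected life-cycle cost (initial cost + Pf x failure cost) under a
# reliability constraint. All functions and numbers are hypothetical.
import numpy as np

load_factors = np.linspace(1.0, 1.8, 81)
initial_cost = 1.0 + 0.6 * (load_factors - 1.0)          # stronger design costs more (normalized)
p_failure = 1e-2 * np.exp(-4.0 * (load_factors - 1.0))   # stronger design fails less often
failure_cost = 50.0                                       # consequence cost (normalized)
p_admissible = 5e-3                                       # prescribed failure probability limit

expected_cost = initial_cost + p_failure * failure_cost
feasible = p_failure <= p_admissible
best = load_factors[feasible][np.argmin(expected_cost[feasible])]
print("optimal seismic load factor ~", round(best, 2))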

Keywords: Life-cycle cost, optimal load factors, reinforced concrete buildings, total costs, type of soil.

730 Investigation of the Main Trends of Tourist Expenses in Georgia

Authors: Nino Abesadze, Marine Mindorashvili, Nino Paresashvili

Abstract:

The main purpose of the article is to make a complex statistical analysis of the tourist expenses of foreign visitors. We used a mixed selection technique that combines the rules of random and proportional selection. The computer software SPSS was used to compute the statistical data for the corresponding analysis. The methodology of tourism statistics was implemented according to international standards. Important information was collected and grouped at the major Georgian airports. Techniques of statistical observation were prepared, and a representative population of foreign visitors and a rule for selecting respondents were determined. The number of tourists shows a growth trend, and the share of tourists from post-Soviet countries constantly increases. The level of satisfaction with tourist facilities and the quality of service has grown, but there is still a problem of disparity between the quality of service and prices. The structure of the tourist expenses of foreign visitors is diverse, and the competitiveness of the tourist products of Georgian tourist companies is higher.

Keywords: Tourist, expenses, methods, statistics, analysis.

729 A Novel Multiresolution based Optimization Scheme for Robust Affine Parameter Estimation

Authors: J.Dinesh Peter

Abstract:

This paper describes a new method for affine parameter estimation between image sequences. Usually, parameter estimation is carried out by least squares with a quadratic error function. However, this technique is sensitive to the presence of outliers, so parameter estimation techniques for various image processing applications must be robust enough to withstand the influence of outliers. Progressively, robust estimation functions with non-quadratic and possibly non-convex potentials, adopted from the statistics literature, have been used to solve this problem. To optimize the error function in a framework that seeks a globally optimal solution, the minimization can begin with a convex estimator at the coarser level and gradually introduce non-convexity, i.e., move from soft to hard redescending non-convex estimators as the iterations reach the finer levels of the multiresolution pyramid. A comparison is made between the performance of the proposed method and the results obtained individually using two different estimators.
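
A minimal sketch of the robust M-estimation ingredient, assuming a Huber-type estimator solved by iteratively reweighted least squares on a simple line-fitting problem; the full affine registration and multiresolution pyramid are not reproduced here:

# Illustrative sketch only: robust parameter estimation by IRLS with a
# Huber-type M-estimator, reducing the influence of gross outliers.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 10, 100)
y = 2.0 * x + 1.0 + rng.normal(0, 0.3, x.size)
y[::10] += 15.0                                 # inject gross outliers

A = np.column_stack([x, np.ones_like(x)])
theta = np.linalg.lstsq(A, y, rcond=None)[0]    # ordinary least squares start
k = 1.345                                        # Huber tuning constant
for _ in range(20):
    r = y - A @ theta
    w = np.where(np.abs(r) <= k, 1.0, k / np.abs(r))   # Huber weights down-weight large residuals
    W = np.diag(w)
    theta = np.linalg.solve(A.T @ W @ A, A.T @ W @ y)

print("robust estimate of (slope, intercept):", theta)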

Keywords: Image Processing, Affine parameter estimation, Outliers, Robust Statistics, Robust M-estimators

728 Factor Driving Consumer Intention in Online Shopping

Authors: Wanida Suwunniponth

Abstract:

The objective of this research was to study the factors influencing consumers' willingness to purchase products online, which included the quality of the website, perceived ease of use, perceived usefulness, trust in online purchases, attitude towards online shopping, and intention to purchase online. The research was conducted with both quantitative and qualitative methods, utilizing a questionnaire and in-depth interviews. The questionnaire was used to collect data from 350 consumers with online shopping experience in Bangkok, Thailand. The statistics utilized in this research included descriptive statistics and path analysis.

The findings revealed that the quality of the website, perceived ease of use, and perceived usefulness influenced trust in online shopping. Trust in turn influenced the attitude towards online purchasing, while trust and attitude together shaped the intention to purchase online.

Keywords: E-commerce, intention, online shopping, Technology acceptance model.

727 Multipurpose Cadastre, Essential for Urban Development Plans in Iran

Authors: Mehrshad Khalaj, Elham Lashkari

Abstract:

The majority of research conducted on Iranian urban development plans indicates that they have been largely unsuccessful in terms of drafting, execution, and goal achievement. A lack or shortage of essential statistics and information can be listed as an important reason for the failure of these plans. The lack of figures and information has become an obvious problem for the country's statistics officials. This problem has forced urban planners themselves to embark on physical surveys, including real estate and land pricing and population and economic censuses of the city. Apart from the burden this places on urban developers, the possibility of errors in such surveys is high. In the present article, using the interview technique, it is argued that utilizing a multipurpose cadastre system as a land information system is essential for urban development plans in Iran; it can minimize or even remove the failures facing these plans.

Keywords: Multipurpose Cadastre, Urban Development Plan (UDP), Land Information System (LIS), Interview Technique

726 Computational Methods in Official Statistics with an Example on Calculating and Predicting Diabetes Mellitus [DM] Prevalence in Different Age Groups within Australia in Future Years, in Light of the Aging Population

Authors: D. Hilton

Abstract:

An analysis of the Australian Diabetes Screening Study estimated the prevalence of undiagnosed diabetes mellitus [DM] in a high-risk, general-practice based cohort. DM prevalence varied from 9.4% to 18.1% depending upon the diagnostic criteria utilised, with age being a highly significant risk factor. Utilising the gold-standard oral glucose tolerance test, the prevalence of DM was 22-23% in those aged >= 70 years and <15% in those aged 40-59 years. Opportunistic screening in Australian general practice can therefore potentially identify many persons with undiagnosed type 2 DM. An Australian Bureau of Statistics document published three years ago reported the highest rate of DM in men aged 65-74 years [19%], whereas the rate for women was highest in those over 75 years [13%]. The Australian Bureau of Statistics reported in 2007 that 13% of the population was over 65 years of age, and this share is projected to increase to 23-25% by 2056 and further to 25-28% by 2101; this information has to be factored into the equation when age-related diabetes prevalence predictions are calculated. This 10-15% proportional increase of elderly persons within the population demographics has dramatic implications for the estimated number of elderly persons with DM in these age groups. Computational methods incorporating the age-related demographic changes reported in these official statistical documents are applied to produce estimates for 2056 and 2101 for different age groups. This has relevance for future diabetes prevalence rates and shows that, along with many countries worldwide, Australia is facing an increasing pandemic. In contrast, Japan is expected to see a decrease in the number of persons with diabetes over the next twenty years.
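
The arithmetic behind such projections can be illustrated as follows; the prevalence rates and population shares below are hypothetical placeholders, not the ABS or screening-study figures:

# Illustrative sketch only: project DM case counts by combining assumed
# age-specific prevalence with a projected population age structure.
prevalence_by_age = {"40-59": 0.08, "60-69": 0.15, "70+": 0.22}   # assumed DM prevalence

scenarios = {
    "2007": {"population_m": 21.0, "share": {"40-59": 0.27, "60-69": 0.08, "70+": 0.09}},
    "2056": {"population_m": 35.0, "share": {"40-59": 0.24, "60-69": 0.11, "70+": 0.17}},
}

for year, s in scenarios.items():
    cases_m = sum(s["population_m"] * s["share"][g] * prevalence_by_age[g]
                  for g in prevalence_by_age)
    print(f"{year}: ~{cases_m:.1f} million projected DM cases in the modelled age groups")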

Keywords: Epidemiological methods, aging, prevalence.

725 Force Statistics and Wake Structure Mechanism of Flow around a Square Cylinder at Low Reynolds Numbers

Authors: Shams-Ul-Islam, Waqas Sarwar Abbasi, Hamid Rahman

Abstract:

A numerical investigation of the flow around a square cylinder at different Reynolds numbers is presented using the multi-relaxation-time lattice Boltzmann method. A detailed analysis is given in terms of time traces of the drag and lift coefficients, power spectra of the lift coefficient, vorticity contour visualizations, streamlines, and phase diagrams. A number of physical quantities, namely the mean drag coefficient, the Strouhal number, and the root-mean-square values of the drag and lift coefficients, are calculated and compared with well-resolved experimental data and numerical results available in the open literature. The physical quantities are found to depend on the Reynolds number.
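
As an illustration of one of the reported quantities, the sketch below recovers the Strouhal number from the power spectrum of a (synthetic) lift-coefficient time trace:

# Illustrative sketch only: dominant shedding frequency of a lift-coefficient
# signal via its power spectrum; the trace here is synthetic, not LBM output.
import numpy as np

dt = 0.01                       # non-dimensional sampling step of the trace
t = np.arange(0, 200, dt)
f_shedding = 0.15               # hypothetical shedding frequency (in D/U units)
cl = 0.8 * np.sin(2 * np.pi * f_shedding * t) + 0.02 * np.random.default_rng(0).normal(size=t.size)

spectrum = np.abs(np.fft.rfft(cl - cl.mean())) ** 2
freqs = np.fft.rfftfreq(t.size, d=dt)
strouhal = freqs[np.argmax(spectrum)]   # St = f D / U, with D = U = 1 here
print("estimated Strouhal number:", round(strouhal, 3))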

Keywords: Code validation, Force statistics, Multi-relaxation-time lattice Boltzmann method, Reynolds numbers, Square cylinder.

724 Perspectives of Renewable Energy in 21st Century in India: Statistics and Estimation

Authors: Manoj Kumar, Rajesh Kumar

Abstract:

The favourable geographical conditions of the Indian subcontinent make it suitable for flourishing renewable energy. The increasing dependence on coal and other conventional sources is driving the world into pollution and depletion of resources. This paper presents statistics of energy consumption and energy generation in the Indian subcontinent, which show that increasing energy demands are surpassing energy generation. With the growth in demand for energy, the usage of coal has increased, since the major portion of energy production in India comes from thermal power plants. The increased usage of thermal power plants causes pollution and depletion of reserves; hence, a paradigm shift to renewable sources is inevitable. In this work, the capacity and potential of renewable sources in India are analyzed, and based on this analysis the future potential of these sources is estimated.

Keywords: Energy consumption and generation, depletion of reserves, pollution, estimation, renewable sources.

723 Performance Evaluation of a Prioritized, Limited Multi-Server Processor-Sharing System That Includes Servers with Various Capacities

Authors: Yoshiaki Shikata, Nobutane Hanayama

Abstract:

We present a prioritized, limited multi-server processor-sharing (PS) system in which each server has a different capacity and N (≥ 2) priority classes are allowed in each PS server. In each prioritized, limited server, a different service ratio is assigned to each class of request, and the number of requests to be processed is limited to less than a certain number. Routing strategies for such prioritized, limited multi-server PS systems that take into account the capacity of each server are also presented, and a performance evaluation procedure for these strategies is discussed. Practical performance measures of these strategies, such as the loss probability, mean waiting time, and mean sojourn time, are evaluated via simulation. In a PS server, at the arrival (or departure) of a request, the extension (shortening) of the remaining sojourn time of each request receiving service can be calculated from the number of requests of each class and the priority ratio. Using a simulation program that executes these events and calculations, the performance of the proposed prioritized, limited multi-server PS rule is analyzed. From the evaluation results, the most suitable routing strategy for the loss or waiting system is clarified.

Keywords: Processor sharing, multi-server, various capacity, N priority classes, routing strategy, loss probability, mean sojourn time, mean waiting time, simulation.

722 Motivation Factors to Influence the Decision to Choose Thai Fabric

Authors: Pisit Potjanajaruwit

Abstract:

The purpose of this research was to study the motivation factors influencing the decision to choose Thai fabric. A multiple-stage sample was utilized to collect 400 responses from working women with diverse occupations all over Thailand. The research was a quantitative analysis, and a questionnaire was used as the tool to collect data. The descriptive statistics used in this research included percentage, average, and standard deviation, and the inferential statistics included hypothesis testing with one-way ANOVA. The research findings revealed that demographic and social factors influenced the positive attitude towards wearing Thai fabric (F = 5.377, p < 0.05): respondents over 41 years old had a more positive attitude towards wearing Thai fabric than the other groups. Moreover, the findings revealed that age influenced the positive attitude towards wearing Thai fabric (F = 3.918, p < 0.05); respondents over 41 years old also believed more strongly than the other groups that wearing Thai fabric to work and to social gatherings is socially acceptable.
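
A minimal sketch of the inferential step, a one-way ANOVA across age groups on synthetic attitude scores (the real survey responses are not available here):

# Illustrative sketch only: one-way ANOVA comparing mean attitude scores
# across age groups, the kind of F-test reported in the abstract.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
under_30 = rng.normal(3.4, 0.6, 120)      # hypothetical attitude scores (1-5 scale)
age_31_40 = rng.normal(3.5, 0.6, 140)
over_41 = rng.normal(3.8, 0.6, 140)

f_stat, p_value = stats.f_oneway(under_30, age_31_40, over_41)
print(f"F = {f_stat:.3f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("At least one age group differs significantly in mean attitude score.")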

Keywords: Decision, Motivation, Influence, Thai Fabric.

721 The Effectiveness of ICT-Assisted PBL on College-Level Nano Knowledge and Learning Skills

Authors: Ya-Ting Carolyn Yang, Ping-Han Cheng, Shi-Hui Gilbert Chang, Terry Yuan-Fang Chen, Chih-Chieh Li

Abstract:

Nanotechnology is widely applied in various areas, so professionals in the related fields have to possess more than nano knowledge alone. In this study, we focus on adopting ICT-assisted PBL in college general education to foster professionals who possess multiple abilities. The research adopted a pretest-posttest quasi-experimental design. The control group received traditional instruction, and the experimental group received ICT-assisted PBL instruction. Descriptive statistics are used to describe the means, standard deviations, and adjusted means of the tests for the two groups. Next, analysis of covariance (ANCOVA) is used to compare the final results of the two research groups after 6 weeks of instruction. The statistics gathered at the end of the research can then be used to make comparisons, showing how the different teaching strategies improve students' understanding of nanotechnology and their learning skills.

Keywords: Nanotechnology, science education, project-based learning, information and communication technology.

720 Comparison between Deterministic and Probabilistic Stability Analysis, Featuring Consequent Risk Assessment

Authors: Isabela Moreira Queiroz

Abstract:

Slope stability analyses are largely carried out by deterministic methods and evaluated through a single safety factor. Although it is known that geotechnical parameters can present great dispersion, in such analyses they are considered fixed and known. Probabilistic methods, in turn, incorporate the variability of the key input parameters (random variables), resulting in a range of safety factor values and thus enabling the determination of the probability of failure, which is an essential parameter in the calculation of risk (probability multiplied by the consequence of the event). Among the probabilistic methods, three are frequently used in the geotechnical community: FOSM (First-Order, Second-Moment), Rosenblueth (point estimates), and Monte Carlo. This paper presents a comparison between the results of deterministic and probabilistic analyses (FOSM, Monte Carlo, and Rosenblueth) applied to a hypothetical slope, carried out to evaluate the behavior of the slope and to perform the consequent risk analysis, which is used to calculate the risk and to analyze mitigation and control solutions. It can be observed that the results obtained by the three probabilistic methods are quite close. It should be noted that the calculation of the risk makes it possible to prioritize the implementation of mitigation measures. It is therefore recommended to perform a good assessment of the geological-geotechnical model, incorporating the uncertainty in viability, design, construction, operation, and closure by means of risk management.
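
A minimal sketch of the Monte Carlo branch of such an analysis, assuming a deliberately simplified factor-of-safety expression and hypothetical parameter distributions rather than the paper's slope model:

# Illustrative sketch only: Monte Carlo estimate of the probability of
# failure and of the risk (probability x consequence) for a toy
# factor-of-safety model FS = (c + w*tan(phi)) / d.
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
c = rng.normal(25.0, 5.0, n)                      # cohesion [kPa], random variable
phi = np.radians(rng.normal(30.0, 3.0, n))        # friction angle [deg -> rad], random variable
w, d = 40.0, 38.0                                  # fixed resisting/driving terms (hypothetical)

fs = (c + w * np.tan(phi)) / d                     # factor of safety per realization
p_failure = np.mean(fs < 1.0)
risk = p_failure * 2.0e6                           # consequence cost of failure (hypothetical, $)
print(f"mean FS = {fs.mean():.2f},  P(failure) ~ {p_failure:.4f},  risk ~ ${risk:,.0f}")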

Keywords: Probabilistic methods, risk assessment, risk management, slope stability.

719 A Secure Auditing Framework for Load Balancing in Cloud Environment

Authors: R. Geetha, T. Padmavathy

Abstract:

A security audit is an important aspect or feature to be considered by cloud service customers. It is basically a certification process to audit the controls that deliver the security requirements. Security audits are conducted by trained and qualified staff belonging to an independent auditing organization, and they must be carried out against a standard of security controls. Proper checks must be made that the cloud user has adequate reporting and logging facilities within the customer's system, thereby ensuring an appropriate business and operational flow of data through the cloud service. We propose a cloud-based secure auditing framework which enables trusted authorities to securely store their secret data with semi-trusted cloud service providers and to selectively share that data with a wide range of data recipients, so as to reduce the key management complexity for data owners and data recipients. Unlike previous cloud-based data frameworks, data owners upload their secret data to the cloud using static and dynamic auditing schemes. A further feature is that when a data recipient wants to download an individual file, the recipient sends a request to the authority, and the data owner holds the access control: only if the owner decides to share the file is the recipient's request acknowledged. Once the acknowledgement for the file is given, the recipient downloads the file, and the transfer date and time and the download date and time are monitored by the auditor. In addition to the deduplication concept, reduced cloud storage usage through dynamic document distribution is also proposed.

Keywords: Cloud computing, cloud storage auditing, data integrity, key exposure.

718 Performance Evaluation of a Limited Round-Robin System

Authors: Yoshiaki Shikata

Abstract:

The performance of a limited Round-Robin (RR) rule is studied in order to clarify the characteristics of a realistic processor-sharing model. Under the limited RR rule, the processor allocates to each request a fixed amount of time, called a quantum, in a fixed order, and the number of requests being allocated quanta is kept below a fixed value. Arriving requests that cannot be allocated quanta because of this restriction are queued or rejected. Practical performance measures, such as the relationships between the mean sojourn time, the mean number of requests, or the loss probability and the quantum size, are evaluated via simulation. In the evaluation, the requested service time of an arriving request is converted into a number of quanta. One of these quanta is included in each RR cycle, that is, the series of quanta allocated to the requests in a fixed order. The service time of the arriving request can be evaluated using the number of RR cycles required to complete the service, the number of requests receiving service, and the quantum size. The increase or decrease in the number of quanta necessary before service is completed is then re-evaluated at the arrival or departure of other requests. Tracking these events and calculations enables us to analyze the performance of the limited RR rule. In particular, we obtain the quantum size that minimizes the mean sojourn time for the case in which the switching time for each quantum is taken into account.

Keywords: Limited RR rule, quantum, processor sharing, sojourn time, performance measures, simulation, loss probability.

717 Spectral Amplitude Coding Optical CDMA: Performance Analysis of PIIN Reduction Using VC Code Family

Authors: Hassan Yousif Ahmed, Ibrahima Faye, N.M.Saad, S.A. Aljined

Abstract:

Multi-user interference (MUI) is the main cause of system deterioration in Spectral Amplitude Coding Optical Code Division Multiple Access (SAC-OCDMA) systems. MUI increases with the number of simultaneous users, resulting in a higher bit error probability, and limits the maximum number of simultaneous users. In addition, the phase-induced intensity noise (PIIN) problem, which originates from the spontaneous emission of the broadband source together with MUI, severely limits the system performance and must be addressed as well. Since MUI is caused by the interference of simultaneous users, it is desirable to keep the MUI value as small as possible. In this paper, an extensive study of the system performance in terms of MUI and PIIN reduction is carried out. Vector Combinatorial (VC) code families are adopted as the signature sequences for the performance analysis, and a comparison with reported codes is performed. The results show that, when the received power increases, the PIIN noise for all the codes increases linearly. The results also show that the effect of PIIN can be minimized by increasing the code weight, which preserves an adequate signal-to-noise ratio and bit error probability. A comparison between the proposed code and existing codes such as Modified Frequency Hopping (MFH) and Modified Quadratic Congruence (MQC) has been carried out.

Keywords: FBG, MUI, PIIN, SAC-OCDMA, VCC.

716 Error Rate Probability for Coded MQAM with MRC Diversity in the Presence of Cochannel Interferers over Nakagami-Fading Channels

Authors: J.S. Ubhi, M.S. Patterh, T.S. Kamal

Abstract:

Exact expressions for the bit error probability (BEP) of coherent square detection of uncoded and coded M-ary quadrature amplitude modulation (MQAM), using an array of antennas with maximal ratio combining (MRC), are derived for a flat-fading, interference-limited system in a Nakagami-m fading environment. The analysis assumes an arbitrary number of independent and identically distributed Nakagami interferers. The results for coded MQAM are computed numerically for the case of the (24,12) extended Golay code and compared with uncoded MQAM by plotting error probabilities versus the average signal-to-interference ratio (SIR) for various values of the order of diversity N and the number of distinct symbols M, in order to examine the effect of cochannel interferers on the performance of the digital communication system. The diversity gains and net gains are also presented in tabular form in order to examine the performance of the system in the presence of interferers as the order of diversity increases. The analytical results presented in this paper are expected to provide useful information for the design and analysis of digital communication systems with space diversity in wireless fading channels.

Keywords: Cochannel interference, maximal ratio combining, Nakagami-m fading, wireless digital communications.

715 A Mobile Multihop Relay Dynamic TDD Scheme for Cellular Networks

Authors: Jong-Moon Chung, Hyung-Weon Cho, Ki-Yong Jin, Min-Hee Cho

Abstract:

In this paper, we present an analytical framework for the evaluation of the uplink performance of multihop cellular networks based on dynamic time division duplex (TDD). New wireless broadband protocols, such as WiMAX, WiBro, and 3G-LTE, apply TDD, and mobile communication protocols under standardization (e.g., IEEE 802.16j) are investigating mobile multihop relay (MMR) as a future technology. A novel MMR TDD scheme is presented, where the dynamic range of the frame is shared between traffic resources of an asymmetric nature and multihop relaying. The mobile communication channel interference model comprises inner interference and co-channel interference (CCI). The performance analysis focuses on the uplink because, due to CCI, the effects of dynamic resource allocation cause significant performance degradation only in the uplink compared to time division multiple access (TDMA) schemes [1-3], whereas the downlink results are the same or better. The analysis is based on the signal-to-interference power ratio (SIR) outage probability of dynamic TDD (D-TDD) and TDMA systems, which are the most widespread multi-user control techniques in mobile communications. This paper presents the uplink SIR outage probability with multihop results and shows that a dynamic TDD scheme applying MMR can provide a performance improvement compared to single-hop operation if executed properly.

Keywords: Co-Channel Interference, Dynamic TDD, Mobile Multihop Relay, Cellular Network, Time Division Multiple Access.

714 Pragati Node Popularity (PNP) Approach to Identify Congestion Hot Spots in MPLS

Authors: E. Ramaraj, A. Padmapriya

Abstract:

In large Internet backbones, service providers typically have to explicitly manage the traffic flows in order to optimize the use of network resources. This process is often referred to as Traffic Engineering (TE). Common objectives of traffic engineering include balancing the traffic distribution across the network and avoiding congestion hot spots. Raj P H and S V K Raja designed a Bayesian network approach to identify congestion hot spots in MPLS. In that approach, a Conditional Probability Distribution (CPD) is specified for every node in the network, and the congestion hot spots are identified based on these CPDs; the traffic can then be distributed so that no link in the network is either over-utilized or under-utilized. Although the Bayesian network approach has been implemented in operational networks, it has a number of well-known scaling issues. This paper proposes a new approach, which we call the Pragati (meaning Progress) Node Popularity (PNP) approach, to identify the congestion hot spots from the network topology alone. In the Pragati Node Popularity approach, IP routing runs natively over the physical topology rather than depending on the CPD of each node as in the Bayesian network. We first illustrate our approach with a simple network and then present a formal analysis of the Pragati Node Popularity approach. For any network handled by the Bayesian approach, the PNP approach identifies exactly the same result with minimum effort. We further extend the result to a more general setting: any network topology, even when the network contains loops. A theoretical insight from our result is that the optimal routing is always shortest-path routing with respect to some consideration of hot spots in the network.
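
The abstract does not give the PNP formula, so the sketch below substitutes a standard topology-only popularity measure, shortest-path betweenness centrality, to illustrate how congestion-prone nodes can be ranked from the topology alone; the example graph is hypothetical:

# Illustrative sketch only: rank nodes by the fraction of shortest paths
# passing through them, a topology-only stand-in for a node-popularity
# measure of likely congestion hot spots (not the authors' PNP formula).
import networkx as nx

G = nx.Graph()
G.add_edges_from([
    ("A", "B"), ("B", "C"), ("C", "D"), ("D", "E"),
    ("B", "F"), ("F", "C"), ("C", "G"), ("G", "E"),
])

popularity = nx.betweenness_centrality(G)   # fraction of shortest paths through each node
for node, score in sorted(popularity.items(), key=lambda kv: -kv[1]):
    print(f"node {node}: popularity {score:.3f}")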

Keywords: Conditional Probability Distribution, Congestion hot spots, Operational Networks, Traffic Engineering.

713 Workstation Design Based On Ergonomics in Animal Feed Packing Process

Authors: Pirutchada Musigapong, Wantanee Phanprasit

Abstract:

The intention of this study is to design an ergonomically optimized sewing-sack workstation in order to improve productivity and decrease musculoskeletal disorders. The physical dimensions of two workers were used to design the new workstation. The physical dimensions are (1) sitting height, (2) mid-shoulder height sitting, (3) shoulder breadth, (4) knee height, (5) popliteal height, (6) hip breadth, and (7) buttock-knee length. The 5th percentile of buttock-knee length sitting (51 cm), the 50th percentile of mid-shoulder height sitting (62 cm), and the 95th percentiles of popliteal height (43 cm) and hip breadth (45 cm) were applied to design the workstation for the sewing-sack operators, and the other dimensions were used to adjust the components of the workstation. The RULA risk assessment scores before and after using the optimized workstation were 7 and 7, and the REBA scores were 11 and 5, respectively. A body discomfort index was used to assess the muscle fatigue of the operators before the workstation adjustment; fatigue was found in the neck, arm, back, and lower back muscles. Therefore, extension and flexion exercises were applied to relieve the musculoskeletal stresses. The workers exercised for 15 minutes before the beginning and after the end of work for 5 days. After that, the flexion and extension capability of the workers increased in three muscle groups (arm, leg, and back).

Keywords: Animal feed, anthropometry, ergonomics, sewing sack, workstation design.
