Search results for: error masking probability

1446 A Syntactic Errors Analysis in the Malaysian ESL Learners' Written Composition

Authors: Annie Gedion, Johan Severinus Tati, Jacinta Caroline Peter

Abstract:

Syntax error analysis studies play a significant role in English language teaching, especially in second language instruction. This study investigates the syntax errors in written composition by 50 multilingual ESL learners in Politeknik Kota Kinabalu Sabah, Malaysia. The subjects speak their own dialect, Malay as their second language, and English as their third or foreign language. Data were collected from written discourse in the form of descriptive essays. The subjects were asked to write in the classroom within 45 minutes. Fifteen categories of errors were classified into a set of syntactic categories and analysed based on the five steps of the syntactic analysis procedure. The findings of the study showed that mother tongue interference, as well as lack of vocabulary and grammar knowledge, were the major sources of syntax errors in the learners' written composition. Learners should be exposed to the differences between Malay and English grammar to avoid interference and to support effective learning of second language writing.

Keywords: errors analysis, syntactic analysis, English as a second language, ESL writing

Procedia PDF Downloads 275
1445 Integration of Technology in Business Education: Emerging Voices from Business Education Classrooms in Nigeria Secondary Schools

Authors: Clinton Chidiebere Anyanwu

Abstract:

Secondary education is a vital part of a virtuous circle of economic growth within the context of a globalised knowledge economy. The teaching of Business Education entails teaching learners the essentials, rudiments, assumptions, and methods of business. Hence, it was deemed necessary for the study to investigate technology integration in Business Education. Drawing from the theoretical frameworks of technological pedagogical content knowledge (TPACK) and the unified theory of acceptance and use of technology (UTAUT), the study observes teachers' level of technology use in Business Education classrooms. Using a mixed-methods sequential explanatory design with probability and purposive sampling, the study found that the majority of participants were not integrating technology to an acceptable level, while only a small percentage were. After an analysis of constructs from UTAUT, some of this could be attributed to the lack of facilitating conditions in the teaching and learning of Business Education. The implication of the study findings is that poor investment in technology integration in secondary schools in Nigeria affects pedagogical implementation and the effective teaching and learning of Business Education subjects. The study concludes that if facilitating conditions and professional development are considered to address the shortfalls in terms of TPACK, technology integration will become a reality in secondary schools in Nigeria.

Keywords: business education, secondary education, technology integration, TPACK, UTAUT

Procedia PDF Downloads 195
1444 Design of a Pneumonia Ontology for Diagnosis Decision Support System

Authors: Sabrina Azzi, Michal Iglewski, Véronique Nabelsi

Abstract:

Diagnostic error is frequent and is one of the most important safety problems today. One of the main objectives of our work is to propose an ontological representation that takes into account the diagnostic criteria in order to improve diagnosis. We chose pneumonia since it is one of the diseases frequently affected by diagnostic errors and has harmful effects on patients. To achieve our aim, we use a semi-automated method to integrate diverse knowledge sources that include publicly available pneumonia disease guidelines from international repositories, biomedical ontologies, and electronic health records. We follow the principles of the Open Biomedical Ontologies (OBO) Foundry. The resulting ontology covers symptoms and signs, all the types of pneumonia, antecedents, pathogens, and diagnostic testing. The first evaluation results show that most of the terms are covered by the ontology. This work is still in progress and represents a first and major step toward the development of a diagnosis decision support system for pneumonia.

Keywords: Clinical decision support system, Diagnostic errors, Ontology, Pneumonia

Procedia PDF Downloads 171
1443 A Study on the Safety Evaluation of Pier According to the Water Level Change by the Monte-Carlo Method

Authors: Minho Kwon, Jeonghee Lim, Yeongseok Jeong, Donghoon Shin, Kiyoung Kim

Abstract:

Recently, global warming has led to natural disasters caused by global environmental changes, and due to abnormal weather events, the frequency and intensity of heavy rainstorms and typhoons are increasing. Therefore, it is imperative to prepare for future heavy rainstorms and typhoons. This study selects arbitrary target bridges and performs numerical analysis to evaluate the safety of bridge piers in the event that the water level changes. The numerical model is based on two-dimensional surface elements. Actual reinforced concrete was simulated by modeling concrete that includes reinforcement, and a contact boundary model was applied between the ground and the concrete. The water level applied to the piers was considered at 18 levels between 7.5 m and 16.1 m. The elastic modulus, compressive strength, tensile strength, and yield strength of the reinforced concrete were generated in 250 random combinations, and numerical analysis was carried out for each water level. The analysis showed that the bridge exceeded the stated limit at a water level of 15.0 m. At the maximum water level of 16.1 m, the concrete's failure probability was 35.2%, while the probability that the reinforcement would fail was 61.2%.
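A minimal sketch of the Monte-Carlo failure-probability estimate described above: material strengths are sampled as random combinations and compared with a structural demand at each water level. The demand functions, distributions, and numerical values below are illustrative placeholders, not the study's finite element results.

```python
import numpy as np

rng = np.random.default_rng(42)
n_samples = 250  # random material-property combinations, as in the study

# Hypothetical material properties (mean, std); the study's actual values are not given.
E_c  = rng.normal(30e3, 3e3, n_samples)    # concrete elastic modulus [MPa]
f_ck = rng.normal(27.0, 3.0, n_samples)    # concrete compressive strength [MPa]
f_y  = rng.normal(400.0, 30.0, n_samples)  # reinforcement yield strength [MPa]

def concrete_demand(water_level, E_c):
    """Placeholder stress demand on the concrete [MPa];
    in the study this comes from the 2-D finite element analysis of the pier."""
    return 0.02 * water_level**2 + 1.0e-4 * E_c

def steel_demand(water_level):
    """Placeholder stress demand on the reinforcement [MPa]."""
    return 1.5 * water_level**2

for water_level in (7.5, 15.0, 16.1):  # a few of the 18 levels between 7.5 m and 16.1 m
    p_fail_concrete = np.mean(concrete_demand(water_level, E_c) > f_ck)
    p_fail_steel    = np.mean(steel_demand(water_level) > f_y)
    print(f"h = {water_level:5.1f} m  P(concrete failure) = {p_fail_concrete:.3f}  "
          f"P(steel failure) = {p_fail_steel:.3f}")
```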

Keywords: Monte-Carlo method, pier, water level change, limit state

Procedia PDF Downloads 278
1442 Neural Network Based Compressor Flow Estimator in an Aircraft Vapor Cycle System

Authors: Justin Reverdi, Sixin Zhang, Serge Gratton, Said Aoues, Thomas Pellegrini

Abstract:

In vapor cycle systems, the flow sensor plays a key role in several monitoring and control tasks. However, physical sensors can be expensive, inaccurate, heavy, cumbersome, or highly sensitive to vibrations, which is especially problematic when embedded into an aircraft. Designing a virtual sensor based on other standard sensors is a good alternative. In this paper, a data-driven model using a Convolutional Neural Network is proposed to estimate the flow of the compressor. To fit the model to our dataset, we tested different loss functions. We show in our application that a Dynamic Time Warping based loss function called DILATE leads to better dynamical performance than the vanilla mean squared error (MSE) loss function. DILATE allows choosing a trade-off between static and dynamic performance.
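To illustrate why an alignment-aware loss can behave differently from MSE on dynamic signals, the sketch below compares plain MSE with a classic dynamic-time-warping distance on a step-like flow target. DILATE itself is a differentiable loss combining soft-DTW shape and temporal terms; the plain DTW used here is only a stand-in for intuition, and the signals are synthetic.

```python
import numpy as np

def mse(y_pred, y_true):
    return np.mean((np.asarray(y_pred) - np.asarray(y_true)) ** 2)

def dtw(y_pred, y_true):
    """Classic dynamic-time-warping distance with squared point-wise cost."""
    a, b = np.asarray(y_pred, float), np.asarray(y_true, float)
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = (a[i - 1] - b[j - 1]) ** 2
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

t = np.linspace(0, 1, 100)
target   = (t > 0.5).astype(float)           # "true" flow step at t = 0.5
shifted  = (t > 0.55).astype(float)          # right shape, slightly late
smoothed = np.clip((t - 0.3) / 0.4, 0, 1)    # smeared-out ramp

for name, pred in [("shifted step", shifted), ("smoothed ramp", smoothed)]:
    print(f"{name:13s}  MSE = {mse(pred, target):.4f}  DTW = {dtw(pred, target):.4f}")
```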

Keywords: deep learning, dynamic time warping, vapor cycle system, virtual sensor

Procedia PDF Downloads 137
1441 Modeling Sediment Yield Using the SWAT Model: A Case Study of Upper Ankara River Basin, Turkey

Authors: Umit Duru

Abstract:

The Soil and Water Assessment Tool (SWAT) was tested for prediction of water balance and sediment yield in the gauged Ankara basin, Turkey. The overall objective of this study was to evaluate the performance and applicability of SWAT in this region of Turkey. Thirteen years of monthly streamflow and suspended sediment data were used for calibration and validation. This research assessed model performance based on differences between observed and predicted suspended sediment yields during the calibration (1987-1996) and validation (1982-1984) periods. Statistical comparisons of suspended sediment produced values for NSE (Nash-Sutcliffe efficiency), RE (relative error), and R² (coefficient of determination) of 0.81, -1.55, and 0.93, respectively, during the calibration period, and NSE, RE (%), and R² of 0.77, -2.61, and 0.87, respectively, during the validation period. Based on the analyses, SWAT satisfactorily simulated the observed hydrology and sediment yields and can be used as a tool in decision making for water resources planning and management in the basin.
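The three goodness-of-fit statistics quoted above can be computed directly from paired observed and simulated series. The sketch below uses one common formulation of each (the relative error here is the percentage bias of total load, an assumption, since the abstract does not spell out its exact definition); the monthly loads are made up for illustration.

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def relative_error(obs, sim):
    """Relative error (%) of total simulated load versus total observed load."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 100.0 * (sim.sum() - obs.sum()) / obs.sum()

def r_squared(obs, sim):
    """Coefficient of determination of the observed-simulated regression."""
    return np.corrcoef(obs, sim)[0, 1] ** 2

# Hypothetical monthly suspended-sediment loads (t/month); not the study's data.
observed  = np.array([120, 95, 210, 340, 280, 150, 90, 60, 75, 130, 180, 240], float)
simulated = np.array([110, 100, 230, 310, 300, 140, 85, 70, 80, 120, 170, 255], float)

print(f"NSE = {nse(observed, simulated):.2f}, "
      f"RE = {relative_error(observed, simulated):.2f} %, "
      f"R2 = {r_squared(observed, simulated):.2f}")
```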

Keywords: calibration, GIS, sediment yield, SWAT, validation

Procedia PDF Downloads 263
1440 A Data Driven Approach for the Degradation of a Lithium-Ion Battery Based on Accelerated Life Test

Authors: Alyaa M. Younes, Nermine Harraz, Mohammad H. Elwany

Abstract:

Lithium-ion batteries are currently used for many applications, including satellites, electric vehicles, and mobile electronics. Their ability to store relatively large amounts of energy in a limited space makes them most appropriate for critical applications. Evaluation of the life of these batteries and their reliability therefore becomes crucial to the systems they support. The reliability of Li-ion batteries has mainly been considered in terms of lifetime. However, another factor that can be considered critical in many applications, such as electric vehicles, is the cycle duration. The present work presents the results of an experimental investigation of the degradation behavior of a laptop Li-ion battery (type TKV2V) and the effect of applied load on the battery cycle time. The reliability was evaluated using an accelerated life test. Least squares linear regression with median rank estimation was used to estimate the Weibull distribution parameters needed for estimating the reliability functions. The probability density function, failure rate, and reliability function under each of the applied loads were evaluated and compared. An inverse power model is introduced that can predict cycle time at any given stress level.
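A compact sketch of the estimation steps named above: median ranks (Bernard's approximation) are regressed against log cycle time to obtain the Weibull shape and scale, and an inverse power law links the characteristic life to the applied load. The cycle-time values and load levels below are hypothetical stand-ins for the experimental data.

```python
import numpy as np

def weibull_mrr(cycle_times):
    """Estimate Weibull shape (beta) and scale (eta) by least-squares regression
    on median ranks (Bernard's approximation (i - 0.3) / (n + 0.4))."""
    t = np.sort(np.asarray(cycle_times, float))
    n = len(t)
    F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)   # median ranks
    x = np.log(t)
    y = np.log(-np.log(1.0 - F))                  # y = beta*x - beta*ln(eta)
    beta, intercept = np.polyfit(x, y, 1)
    eta = np.exp(-intercept / beta)
    return beta, eta

def reliability(t, beta, eta):
    return np.exp(-(t / eta) ** beta)

def hazard(t, beta, eta):
    return (beta / eta) * (t / eta) ** (beta - 1)

# Hypothetical cycle times to failure (cycles) at one stress level; not the study's data.
cycles = [420, 480, 515, 560, 610, 660, 730, 805]
beta, eta = weibull_mrr(cycles)
print(f"beta = {beta:.2f}, eta = {eta:.1f} cycles, R(500) = {reliability(500, beta, eta):.3f}")

# Inverse power law linking characteristic life to applied load L: eta(L) = A * L**(-m).
loads = np.array([1.0, 1.5, 2.0])                 # hypothetical stress levels
etas  = np.array([eta, 0.7 * eta, 0.5 * eta])     # hypothetical fitted scales per load
slope, lnA = np.polyfit(np.log(loads), np.log(etas), 1)
m = -slope
print(f"inverse power law: eta(L) = {np.exp(lnA):.1f} * L^(-{m:.2f})")
```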

Keywords: accelerated life test, inverse power law, lithium-ion battery, reliability evaluation, Weibull distribution

Procedia PDF Downloads 155
1439 Housing Price Prediction Using Machine Learning Algorithms: The Case of Melbourne City, Australia

Authors: The Danh Phan

Abstract:

House price forecasting is a main topic in real estate market research. Effective house price prediction models not only allow home buyers and real estate agents to make better data-driven decisions but may also benefit the property policymaking process. This study investigates the housing market by using machine learning techniques to analyze real historical house sale transactions in Australia. It seeks useful models which could be deployed as an application for house buyers and sellers. Data analytics show a high discrepancy between house prices in the most expensive suburbs and the most affordable suburbs in the city of Melbourne. In addition, experiments demonstrate that the combination of stepwise selection and a Support Vector Machine (SVM), based on the Mean Squared Error (MSE) measurement, consistently outperforms other models in terms of prediction accuracy.
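As a rough illustration of the SVM-with-MSE evaluation mentioned above, the sketch below fits a support vector regressor to synthetic Melbourne-style features and scores it by MSE on a held-out split. The features, coefficients, and hyperparameters are assumptions; the study's stepwise selection step and its real transaction data are not reproduced here.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)

# Hypothetical features: rooms, distance to CBD (km), land size (m2), building area (m2).
n = 500
X = np.column_stack([
    rng.integers(1, 6, n),
    rng.uniform(1, 40, n),
    rng.uniform(100, 900, n),
    rng.uniform(60, 400, n),
])
price = (150_000 * X[:, 0] - 12_000 * X[:, 1] + 400 * X[:, 2]
         + 1_500 * X[:, 3] + rng.normal(0, 50_000, n))

X_tr, X_te, y_tr, y_te = train_test_split(X, price, test_size=0.2, random_state=0)

svr = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=1e6, epsilon=10_000))
svr.fit(X_tr, y_tr)
print(f"SVR test MSE = {mean_squared_error(y_te, svr.predict(X_te)):,.0f}")
```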

Keywords: house price prediction, regression trees, neural network, support vector machine, stepwise

Procedia PDF Downloads 207
1438 Modeling the Demand for the Healthcare Services Using Data Analysis Techniques

Authors: Elizaveta S. Prokofyeva, Svetlana V. Maltseva, Roman D. Zaitsev

Abstract:

Rapidly evolving modern data analysis technologies in healthcare play a large role in understanding the operation of the system and its characteristics. Nowadays, one of the key tasks in urban healthcare is to optimize resource allocation. Thus, the application of data analysis in medical institutions to solve optimization problems determines the significance of this study. The purpose of this research was to establish the dependence between the indicators of the effectiveness of a medical institution and its resources. Hospital discharges by diagnosis, hospital days of in-patients, and in-patient average length of stay were selected as the performance indicators and the demand of the medical facility. Hospital beds by type of care, medical technology (magnetic resonance tomography, gamma cameras, angiographic complexes, and lithotripters) and physicians characterized the resource provision of medical institutions for the developed models. The data source for the research was the open database of the statistical service Eurostat. This source was chosen because its databases contain complete and open information necessary for research tasks in the field of public health. In addition, the statistical database has a user-friendly interface that allows analytical reports to be built quickly. The study covers 28 European countries for the period from 2007 to 2016. For all countries included in the study, with the most accurate and complete data for the period under review, predictive models were developed based on historical panel data. An attempt to improve the quality and the interpretability of the models was made through cluster analysis of the investigated set of countries. The main idea was to assess the similarity of the joint behavior of the variables throughout the time period under consideration in order to identify groups of similar countries and to construct separate regression models for them. Therefore, the original time series were used as the objects of clustering, and the k-medoids algorithm was applied. The sampled objects were used as the centers of the clusters obtained, since determining a centroid when working with time series involves additional difficulties. The number of clusters was chosen using the silhouette coefficient. After the cluster analysis, it was possible to significantly improve the predictive power of the models: for example, in one of the clusters, the MAPE was only 0.82%, which makes it possible to conclude that this forecast is highly reliable in the short term. The predicted values of the developed models have a relatively low level of error and can be used by medical personnel to make decisions on the resource provision of the hospital. The research displays strong dependencies between the demand for medical services and the modern medical equipment variable, which highlights the importance of the technological component for the successful development of a medical facility. Currently, data analysis has huge potential to significantly improve health services. Medical institutions that are the first to introduce these technologies will certainly have a competitive advantage.
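A small sketch of the two quantitative ingredients mentioned above: a PAM-style k-medoids step working on a precomputed distance matrix between country time series (with the silhouette coefficient as the usual way to pick the number of clusters), and the MAPE used to report forecast error. The series, distances, and values below are synthetic; this is not the study's Eurostat data.

```python
import numpy as np

def mape(actual, predicted):
    """Mean absolute percentage error (%)."""
    actual, predicted = np.asarray(actual, float), np.asarray(predicted, float)
    return 100.0 * np.mean(np.abs((actual - predicted) / actual))

def k_medoids(D, k, n_iter=100, seed=0):
    """Minimal PAM-style k-medoids on a precomputed distance matrix D."""
    rng = np.random.default_rng(seed)
    n = D.shape[0]
    medoids = rng.choice(n, k, replace=False)
    for _ in range(n_iter):
        labels = np.argmin(D[:, medoids], axis=1)          # assign to nearest medoid
        new_medoids = medoids.copy()
        for c in range(k):
            members = np.where(labels == c)[0]
            if len(members) == 0:
                continue
            costs = D[np.ix_(members, members)].sum(axis=1)  # total distance inside cluster
            new_medoids[c] = members[np.argmin(costs)]
        if np.array_equal(new_medoids, medoids):
            break
        medoids = new_medoids
    return labels, medoids

# Hypothetical country-level time series (rows = countries, columns = years).
rng = np.random.default_rng(1)
series = np.vstack([rng.normal(100, 5, 10) + 3 * np.arange(10),   # growing demand
                    rng.normal(100, 5, 10) + 3 * np.arange(10),
                    rng.normal(80, 5, 10),                          # flat demand
                    rng.normal(80, 5, 10)])
D = np.linalg.norm(series[:, None, :] - series[None, :, :], axis=2)
labels, medoids = k_medoids(D, k=2)
print("cluster labels:", labels)
print(f"example MAPE = {mape([100, 110, 120], [101, 108, 122]):.2f} %")
```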

Keywords: data analysis, demand modeling, healthcare, medical facilities

Procedia PDF Downloads 131
1437 Hydraulic Analysis of Irrigation Approach Channel Using HEC-RAS Model

Authors: Muluegziabher Semagne Mekonnen

Abstract:

This study was intended to estimate the irrigation water requirements and to evaluate canal hydraulics under steady-state conditions in order to improve the scheme performance of the Meki-Ziway irrigation project. The CROPWAT 8.0 model was used to estimate the irrigation water requirements of the five major crops irrigated in the study area. The results showed that for the existing and potential irrigation development areas of 2000 ha and 2599 ha, the crop water requirements were 3,339,200 and 4,339,090.4 m³, respectively. Hydraulic simulation models are fundamental tools for understanding the hydraulic flow characteristics of irrigation systems. In this study, a hydraulic analysis of the irrigation canals of the Meki-Ziway scheme was conducted using the HEC-RAS model. The HEC-RAS model was tested in terms of error estimation and used to determine the potential canal capacity.
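A quick arithmetic check of the reported volumes: both figures correspond to roughly the same per-hectare requirement, which is a useful consistency check when scaling from the existing to the potential command area. The snippet below only reproduces that arithmetic.

```python
areas_ha = {"existing": 2000, "potential": 2599}
volumes_m3 = {"existing": 3_339_200, "potential": 4_339_090.4}

for name in areas_ha:
    per_ha = volumes_m3[name] / areas_ha[name]
    # Both work out to about 1,670 m3/ha, i.e. a gross requirement of roughly 167 mm.
    print(f"{name}: {per_ha:.1f} m3/ha ({per_ha / 10:.0f} mm)")
```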

Keywords: HEC-RAS, irrigation, hydraulics, canal reach, capacity

Procedia PDF Downloads 47
1436 A Study of Islamic Stock Indices and Macroeconomic Variables

Authors: Mohammad Irfan

Abstract:

The purpose of this paper is to investigate the relationship among key macroeconomic variables and the Islamic stock market in India. This study is based on time series data for the financial years 2009-2015 and explores the consistency of the relationship between macroeconomic variables and Shariah indices. The ADF (Augmented Dickey-Fuller) and PP (Phillips-Perron) tests are employed to check the stationarity of the data. The study depicts the long-run relationship between Shariah indices and macroeconomic variables by using the Johansen co-integration test. BSE Shariah and Nifty Shariah show unidirectional Granger causality. The outcome of the VECM significantly confirms the applicability of the best-fitted model. Thus, Islamic stock indices are working proficiently for the development of the Indian economy, and the results suggest that the Islamic stock market will interact more closely with other macroeconomic variables in the future.
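The unit-root and causality steps named above map directly onto standard statsmodels routines. The sketch below runs an ADF test and a Granger causality test on two synthetic monthly series standing in for a Shariah index and a macroeconomic variable; the lag order, series, and variable names are assumptions, not the study's data.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import adfuller, grangercausalitytests

rng = np.random.default_rng(0)
n = 84  # hypothetical monthly span roughly matching 2009-2015
macro = np.cumsum(rng.normal(0, 1, n))                       # non-stationary macro variable
shariah = 0.6 * np.roll(macro, 1) + rng.normal(0, 1, n)      # index lags the macro variable
df = pd.DataFrame({"shariah": shariah, "macro": macro}).iloc[1:]

# Augmented Dickey-Fuller test for stationarity (null hypothesis: unit root).
for col in df:
    stat, pvalue = adfuller(df[col])[:2]
    print(f"ADF {col}: statistic = {stat:.2f}, p-value = {pvalue:.3f}")

# Granger causality: does the macro variable help predict the Shariah index?
res = grangercausalitytests(df[["shariah", "macro"]], maxlag=2)
for lag, (tests, _) in res.items():
    f_stat, p_val = tests["ssr_ftest"][:2]
    print(f"Granger lag {lag}: F = {f_stat:.2f}, p = {p_val:.4f}")
```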

Keywords: Indian Shariah Indices, macroeconomic variables, co-integration, Granger causality, vector error correction model (VECM)

Procedia PDF Downloads 269
1435 Investigation of Roll-Off Factor in Pulse Shaping Filter on Maximal Ratio Combining for CDMA 2000 System

Authors: G. S. Walia, H. P. Singh, D. Padma

Abstract:

The integration of a wide variety of communication services has been made possible by the invention of 3G technology. Code Division Multiple Access 2000 operates at various chip rates, 1.2288 or 3.6864 Mcps (1x or 3x systems). It is a 3G system which offers high bandwidth and wireless broadband services, but its efficiency is lowered by various factors such as fading, interference, scattering, and absorption. This paper investigates the effect of diversity (MRC) and of the roll-off factor in the Root Raised Cosine (RRC) filter for the BPSK and QPSK modulation schemes. It is possible to transmit data with minimum intersymbol interference (ISI) within a limited bandwidth with a proper pulse shaping technique. Bit error rate (BER) performance is analyzed by applying the diversity technique and varying the roll-off factor for BPSK and QPSK. The roll-off factor reduces ISI, and diversity reduces fading.
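The roll-off factor enters through the RRC impulse response used for pulse shaping. Below is a sketch of that response with its two removable singularities handled explicitly; the filter span, samples per symbol, and the crude "tail energy" figure of merit are illustrative choices, not the paper's simulation setup.

```python
import numpy as np

def rrc_impulse_response(beta, span_symbols=8, sps=8):
    """Root-raised-cosine impulse response (unit-energy normalized).

    beta         : roll-off factor (0 < beta <= 1)
    span_symbols : filter length in symbol periods
    sps          : samples per symbol
    """
    x = np.arange(-span_symbols * sps // 2, span_symbols * sps // 2 + 1) / sps  # t / T
    h = np.empty_like(x)
    for k, xi in enumerate(x):
        if np.isclose(xi, 0.0):
            h[k] = 1.0 + beta * (4.0 / np.pi - 1.0)
        elif np.isclose(abs(xi), 1.0 / (4.0 * beta)):
            h[k] = (beta / np.sqrt(2.0)) * (
                (1.0 + 2.0 / np.pi) * np.sin(np.pi / (4.0 * beta))
                + (1.0 - 2.0 / np.pi) * np.cos(np.pi / (4.0 * beta)))
        else:
            num = (np.sin(np.pi * xi * (1.0 - beta))
                   + 4.0 * beta * xi * np.cos(np.pi * xi * (1.0 + beta)))
            den = np.pi * xi * (1.0 - (4.0 * beta * xi) ** 2)
            h[k] = num / den
    return h / np.sqrt(np.sum(h ** 2))

# A larger roll-off widens the occupied bandwidth (1 + beta)/(2T) but damps the pulse
# tails faster, reducing the ISI caused by neighbouring symbols and timing error.
for beta in (0.22, 0.5, 1.0):
    h = rrc_impulse_response(beta)
    c = len(h) // 2
    print(f"beta = {beta:4.2f}  energy outside the central symbol = "
          f"{1.0 - np.sum(h[c - 8:c + 9] ** 2):.4f}")
```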

Keywords: CDMA2000, root raised cosine, roll-off factor, ISI, diversity, interference, fading

Procedia PDF Downloads 394
1434 Leverage Effect for Volatility with Generalized Laplace Error

Authors: Farrukh Javed, Krzysztof Podgórski

Abstract:

We propose a new model that accounts for the asymmetric response of volatility to positive ('good news') and negative ('bad news') shocks in economic time series, the so-called leverage effect. In the past, asymmetric powers of errors in conditionally heteroskedastic models have been used to capture this effect. Our model uses the gamma difference representation of the generalized Laplace distributions, which efficiently models the asymmetry. It has one additional natural parameter, the shape, that is used instead of the power in asymmetric power models to capture the strength of a long-lasting effect of shocks. Some fundamental properties of the model are provided, including the formula for covariances and an explicit form for the conditional distribution of 'bad' and 'good' news processes given the past, a property that is important for the statistical fitting of the model. Relevant features of volatility models are illustrated using S&P 500 historical data.
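The gamma-difference representation mentioned above can be sampled directly: a generalized asymmetric Laplace variable is the difference of two independent gamma variables sharing the shape parameter. The sketch below draws such samples and shows how the shape parameter controls the tails; the scale values are arbitrary and this is not the paper's fitted model.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def gal_sample(shape, scale_pos, scale_neg, size, rng):
    """Generalized asymmetric Laplace draws via the gamma-difference representation:
    Y = G+ - G-, with G+ ~ Gamma(shape, scale_pos) and G- ~ Gamma(shape, scale_neg).
    Unequal scales give a skewed ('bad news' heavier than 'good news') distribution."""
    g_pos = rng.gamma(shape, scale_pos, size)
    g_neg = rng.gamma(shape, scale_neg, size)
    return g_pos - g_neg

# shape = 1 recovers the ordinary (asymmetric) Laplace; larger shape gives lighter tails.
for shape in (0.5, 1.0, 2.0):
    y = gal_sample(shape, scale_pos=0.8, scale_neg=1.2, size=200_000, rng=rng)
    print(f"shape = {shape:3.1f}  skewness = {stats.skew(y):+.2f}  "
          f"excess kurtosis = {stats.kurtosis(y):.2f}")
```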

Keywords: heavy tails, volatility clustering, generalized asymmetric Laplace distribution, leverage effect, conditional heteroskedasticity, asymmetric power volatility, GARCH models

Procedia PDF Downloads 371
1433 Granger Causal Nexus between Financial Development and Energy Consumption: Evidence from Cross Country Panel Data

Authors: Rudra P. Pradhan

Abstract:

This paper examines the Granger causal nexus between financial development and energy consumption in the group of 35 Financial Action Task Force (FATF) countries over the period 1988-2012. The study uses two financial development indicators, namely private sector credit and stock market capitalization, and seven energy consumption indicators, namely coal, oil, gas, electricity, hydroelectric, nuclear, and biomass. Using panel cointegration tests, the study finds that financial development and energy consumption are cointegrated, indicating the presence of a long-run relationship between the two. Using a panel vector error correction model (VECM), the study detects both bidirectional and unidirectional causality between financial development and energy consumption. The variation in this causality is due to the use of different proxies for both financial development and energy consumption. The policy implication of this study is that economic policies should recognize the differences in the financial development-energy consumption nexus in order to maintain sustainable development in the selected 35 FATF countries.
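For a single pair of series, the cointegration-then-VECM pipeline described above can be sketched with statsmodels; a panel version pools such estimates across countries. The two series below are synthetic and share a common trend by construction, and the deterministic term and lag order are assumptions rather than the paper's specification.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import coint_johansen, VECM

rng = np.random.default_rng(0)
n = 300  # hypothetical sample length; the study uses annual panel data for 1988-2012

# Hypothetical cointegrated pair: financial development (credit) and energy consumption
# share a common stochastic trend, so a linear combination of them is stationary.
trend = np.cumsum(rng.normal(0, 1, n))
credit = trend + rng.normal(0, 0.5, n)
energy = 0.8 * trend + rng.normal(0, 0.5, n)
data = pd.DataFrame({"credit": credit, "energy": energy})

# Johansen trace test for the cointegration rank.
jres = coint_johansen(data, det_order=0, k_ar_diff=1)
print("trace statistics:  ", np.round(jres.lr1, 2))
print("5% critical values:", np.round(jres.cvt[:, 1], 2))

# Fit a VECM with one cointegrating relation; alpha holds the error-correction loadings.
vecm = VECM(data, k_ar_diff=1, coint_rank=1, deterministic="co").fit()
print("error-correction loadings (alpha):", np.round(vecm.alpha.ravel(), 3))
```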

Keywords: energy consumption, financial development, FATF countries, Panel VECM

Procedia PDF Downloads 254
1432 Software Verification of Systematic Resampling for Optimization of Particle Filters

Authors: Osiris Terry, Kenneth Hopkinson, Laura Humphrey

Abstract:

Systematic resampling is the most popular resampling method in particle filters. This paper seeks to further the understanding of systematic resampling by defining a formula made up of variables from the sampling equation and the particle weights. The formula is then verified via SPARK, a software verification language. The verified systematic resampling formula states that the minimum/maximum number of possible samples taken of a particle is equal to the floor/ceiling value of the particle weight divided by the sampling interval, respectively. This allows for the creation of a randomness spectrum that each resampling method falls within. Methods on the lower end, e.g., systematic resampling, have less randomness and are thus quicker to reach an estimate. Although lower randomness introduces a larger bias towards the size of the weight, this bias creates vulnerabilities to noise in the environment, e.g., jamming. In conclusion, this is the first step in characterizing each resampling method. This will allow target-tracking engineers to pick the best resampling method for their environment instead of choosing the most popular one.
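A direct sketch of the resampling scheme and the verified bound: copies of each particle are counted after systematic resampling and checked against the floor and ceiling of weight divided by the sampling interval 1/N. This is a plain Python illustration, not the SPARK-verified code.

```python
import numpy as np

def systematic_resample(weights, rng):
    """Systematic resampling: one uniform draw, then evenly spaced positions.

    Returns the number of copies drawn of each particle. The verified property is
    floor(w_i / u) <= copies_i <= ceil(w_i / u), where u = 1/N is the sampling interval.
    """
    w = np.asarray(weights, float)
    w = w / w.sum()
    n = len(w)
    positions = (rng.uniform() + np.arange(n)) / n      # evenly spaced in [0, 1)
    cumulative = np.cumsum(w)
    indexes = np.searchsorted(cumulative, positions)
    indexes = np.minimum(indexes, n - 1)                # guard against round-off at 1.0
    return np.bincount(indexes, minlength=n)

rng = np.random.default_rng(7)
weights = rng.random(10)
weights /= weights.sum()
copies = systematic_resample(weights, rng)

interval = 1.0 / len(weights)
lower = np.floor(weights / interval)
upper = np.ceil(weights / interval)
print("copies per particle:", copies)
print("floor/ceiling bound respected:", bool(np.all((copies >= lower) & (copies <= upper))))
```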

Keywords: SPARK, software verification, resampling, systematic resampling, particle filter, tracking

Procedia PDF Downloads 69
1431 Ultra-High Precision Diamond Turning of Infrared Lenses

Authors: Khaled Abou-El-Hossein

Abstract:

The presentation will address the features of two IR convex lenses that have been manufactured using an ultra-high precision machining centre based on single-point diamond turning. The lenses are made from silicon and germanium with a radius of curvature of 500 mm. Because of the brittle nature of silicon and germanium, the machining parameters were selected in such a way that the ductile regime was achieved. The cutting speed was 800 rpm, while the feed rate and depth of cut were 20 mm/min and 20 µm, respectively. Although both materials comprise a mono-crystalline microstructure and are quite similar in terms of optical properties, machining of silicon was accompanied by more difficulties in terms of form accuracy than germanium machining. The P-V error of the silicon profile was 0.222 µm, while it was only 0.055 µm for the germanium lens. This could be attributed to the accelerated wear that takes place on the tool edge when turning mono-crystalline silicon. Currently, we are using other ranges of the machining parameters in order to determine the optimal range that yields satisfactory form accuracy when fabricating silicon lenses.

Keywords: diamond turning, optical surfaces, precision machining, surface roughness

Procedia PDF Downloads 309
1430 Kinetic Modeling of Transesterification of Triacetin Using Synthesized Ion Exchange Resin (SIERs)

Authors: Hafizuddin W. Yussof, Syamsutajri S. Bahri, Adam P. Harvey

Abstract:

Strong anion exchange resins with QN+OH- groups have the potential to be developed and employed as heterogeneous catalysts for transesterification, as they are chemically stable against leaching of the functional group. Nine different SIERs (SIER1-9) with QN+OH- were prepared by suspension polymerization of vinylbenzyl chloride-divinylbenzene (VBC-DVB) copolymers in the presence of n-heptane (a pore-forming agent). The amine group was successfully grafted onto the polymeric resin beads through functionalization with trimethylamine. These SIERs were then used as catalysts for the transesterification of triacetin with methanol. A set of differential equations representing the Langmuir-Hinshelwood-Hougen-Watson (LHHW) and Eley-Rideal (ER) models for the transesterification reaction was developed, and these kinetic models were fitted to the experimental data. Overall, the synthesized ion exchange resin-catalyzed reaction was better described by the Eley-Rideal model than by the LHHW model, with sums of squared errors (SSE) of 0.742 and 0.996, respectively.
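To make the model-discrimination step concrete, the sketch below fits one common algebraic Eley-Rideal rate form (methanol from the bulk reacting with adsorbed triacetin) to synthetic rate data and reports the SSE. The rate expression, parameter values, and data are assumptions; the study fits full differential models to its experimental profiles.

```python
import numpy as np
from scipy.optimize import curve_fit

def eley_rideal_rate(X, k, K):
    """One common Eley-Rideal rate form: adsorbed triacetin reacting with
    methanol from the bulk phase.  X = (C_triacetin, C_methanol)."""
    c_t, c_m = X
    return k * K * c_t * c_m / (1.0 + K * c_t)

# Hypothetical concentration grid and 'measured' rates; not the study's data.
rng = np.random.default_rng(3)
c_t = np.tile(np.linspace(0.1, 1.0, 8), 3)
c_m = np.repeat([3.0, 6.0, 9.0], 8)
true_k, true_K = 0.05, 2.0
rates = eley_rideal_rate((c_t, c_m), true_k, true_K) * (1 + rng.normal(0, 0.05, c_t.size))

popt, _ = curve_fit(eley_rideal_rate, (c_t, c_m), rates, p0=(0.01, 1.0))
sse = np.sum((rates - eley_rideal_rate((c_t, c_m), *popt)) ** 2)
print(f"k = {popt[0]:.4f}, K = {popt[1]:.3f}, SSE = {sse:.4f}")
```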

Keywords: anion exchange resin, Eley-Rideal, Langmuir-Hinshelwood-Hougen-Watson, transesterification

Procedia PDF Downloads 347
1429 A Field Study of Monochromatic Light Effects on Antibody Responses to Newcastle Disease by HI Test and the Correlation with ELISA

Authors: Seyed Mehrzad Pahlavani, Mozaffar Haji Jafari Anaraki, Sayma Mohammadi

Abstract:

A total of 34,700 day-old broilers were exposed to green, blue, or yellow light using a light-emitting diode system for 6 weeks to investigate the effects of light wavelength on antibody responses to Newcastle disease measured by the HI test, and the correlation with ELISA. Three poultry-house broiler farms with the same conditions were selected, and the lighting system of each was set according to the requirement. Blood samples were taken from 20 chicks on days 1, 24 and 46, and the Newcastle virus specific antibody in serum was titered using the HI and ELISA tests. On day 24, probability values greater than 0.05 were observed in the HI and ELISA tests of all groups, while at the end of the rearing period the average HI serum antibody titer was higher under the green light than under the yellow light, and the blue light was not significantly different from either. At the last titration, the green light group had the highest titer of Newcastle antibodies. There were no significant differences in Newcastle antibody titers between groups or ages in the ELISA results. According to the sampling and analysis of the HI and ELISA serum tests, there were no significant relationships between broilers reared under green, blue and yellow light on days 24 and 46, with P-values greater than 0.05. It is suggested that monochromatic light affects broiler immunity against Newcastle disease.

Keywords: monochromatic light, Newcastle disease, HI test, ELISA test

Procedia PDF Downloads 646
1428 An Efficient Process Analysis and Control Method for Tire Mixing Operation

Authors: Hwang Ho Kim, Do Gyun Kim, Jin Young Choi, Sang Chul Park

Abstract:

Since the tire production process is very complicated, company-wide management of it is very difficult, necessitating considerable amounts of capital and labor. Thus, productivity should be enhanced and kept competitive by developing and applying effective production plans. Among the major processes for tire manufacturing, consisting of mixing, component preparation, building, and curing, the mixing process is an essential and important step because the main component of the tire, called the compound, is formed at this step. The compound, a rubber blend with various characteristics, plays its own role in the finished tire. Meanwhile, scheduling the tire mixing process is similar to the flexible job shop scheduling problem (FJSSP) because various kinds of compounds have their own orders of operations, and a set of alternative machines can be used to process each operation. In addition, the setup time required for different operations may differ due to the alteration of additives. In other words, each operation of the mixing process requires a different setup time depending on the previous one, and this feature, called sequence dependent setup time (SDST), is a very important issue in traditional scheduling problems such as flexible job shop scheduling. However, despite its importance, there exist few research works dealing with the tire mixing process. Thus, in this paper, we consider the scheduling problem for the tire mixing process and suggest an efficient particle swarm optimization (PSO) algorithm to minimize the makespan for completing all the required jobs belonging to the process. Specifically, we design a particle encoding scheme for the considered scheduling problem, including a processing sequence for compounds and machine allocation information for each job operation, and a method for generating a tire mixing schedule from a given particle. At each iteration, the positions and velocities of the particles are updated, and the current solution is compared with the new solution. This procedure is repeated until a stopping condition is satisfied. The performance of the proposed algorithm is validated through a numerical experiment using small-sized problem instances representing the tire mixing process. Furthermore, we compare the solution of the proposed algorithm with the one obtained by solving a mixed integer linear programming (MILP) model developed in previous research work. As a performance measure, we define an error rate which evaluates the difference between the two solutions. As a result, we show that the PSO algorithm proposed in this paper outperforms the MILP model with respect to effectiveness and efficiency. As a direction for future work, we plan to consider scheduling problems in other processes such as building and curing. We can also extend our current work by considering other performance measures, such as weighted makespan or processing times affected by aging or learning effects.
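A compact sketch of the decode-evaluate-update loop described above, using random-key decoding (sorting a continuous particle gives a job sequence) on a toy single-machine instance with sequence-dependent setups; a brute-force optimum stands in for the MILP solution when computing the error rate. The encoding here is deliberately simpler than the paper's sequence-plus-machine-allocation scheme, and all instance data are made up.

```python
import numpy as np
from itertools import permutations

rng = np.random.default_rng(0)

# Toy instance: 6 compound jobs on one mixer with sequence-dependent setup times (SDST).
n_jobs = 6
proc = rng.uniform(10, 30, n_jobs)            # processing time of each job
setup = rng.uniform(1, 8, (n_jobs, n_jobs))   # setup[i, j]: setup before j when i follows i

def makespan(order):
    total = proc[order[0]]
    for prev, nxt in zip(order[:-1], order[1:]):
        total += setup[prev, nxt] + proc[nxt]
    return total

def decode(particle):
    """Random-key decoding: sorting the continuous keys yields a job sequence."""
    return np.argsort(particle)

# Global-best PSO on the continuous keys.
n_particles, n_iter, w, c1, c2 = 30, 200, 0.7, 1.5, 1.5
pos = rng.random((n_particles, n_jobs))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_cost = np.array([makespan(decode(p)) for p in pos])
gbest = pbest[np.argmin(pbest_cost)].copy()
gbest_cost = pbest_cost.min()

for _ in range(n_iter):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    cost = np.array([makespan(decode(p)) for p in pos])
    improved = cost < pbest_cost
    pbest[improved], pbest_cost[improved] = pos[improved], cost[improved]
    if pbest_cost.min() < gbest_cost:
        gbest_cost = pbest_cost.min()
        gbest = pbest[np.argmin(pbest_cost)].copy()

# Brute-force optimum stands in for the exact MILP solution on this toy instance.
exact = min(makespan(p) for p in permutations(range(n_jobs)))
error_rate = (gbest_cost - exact) / exact
print(f"PSO makespan = {gbest_cost:.1f}, exact = {exact:.1f}, error rate = {error_rate:.2%}")
```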

Keywords: compound, error rate, flexible job shop scheduling problem, makespan, particle encoding scheme, particle swarm optimization, sequence dependent setup time, tire mixing process

Procedia PDF Downloads 250
1427 An Enhanced Hybrid Backoff Technique for Minimizing the Occurrence of Collision in Mobile Ad Hoc Networks

Authors: N. Sabiyath Fatima, R. K. Shanmugasundaram

Abstract:

In Mobile Ad hoc Networks (MANETs), every node acts as both transmitter and receiver. The existing backoff models do not exactly forecast the performance of the wireless network and also experience elevated packet collisions. Every time a collision happens, the station's contention window (CW) is doubled until it reaches the maximum value. The main objective of this paper is to diminish collisions by means of a contention window Multiplicative Increase Decrease Backoff (CWMIDB) scheme. The intention of increasing the CW is to reduce the collision probability by spreading the traffic over a longer period of time. Within wireless ad hoc networks, the CWMIDB algorithm dynamically controls the contention window of the nodes experiencing collisions. During packet communication, the backoff counter is uniformly selected from the range [0, CW-1]. Here, CW denotes the contention window, and its value depends on the number of unsuccessful transmissions that have occurred for the packet. On the initial transmission attempt, CW is set to its minimum value (CWmin); if the transmission attempt fails, the value is doubled, and it is set back to the minimum value on a successful transmission. CWMIDB is simulated in the NS2 environment and its performance is compared with the Binary Exponential Backoff algorithm. The simulation results show an improvement in transmission probability compared to that of the existing backoff algorithm.
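The window dynamics can be contrasted in a few lines: binary exponential backoff resets CW to the minimum after a success, whereas a multiplicative increase/decrease rule shrinks it gradually. The increase and decrease factors below are assumptions (the paper's exact CWMIDB parameters are not given in the abstract), and the fixed collision probability is a simplification; a real MANET study such as the NS2 simulation couples collisions to contention.

```python
import random

CW_MIN, CW_MAX = 32, 1024

def beb_update(cw, collided):
    """Binary exponential backoff: double on collision, reset to minimum on success."""
    return min(cw * 2, CW_MAX) if collided else CW_MIN

def midb_update(cw, collided, inc=2.0, dec=0.75):
    """Multiplicative increase / multiplicative decrease (illustrative CWMIDB-style rule;
    the actual increase and decrease factors are assumptions here)."""
    return min(int(cw * inc), CW_MAX) if collided else max(int(cw * dec), CW_MIN)

def simulate(update, n_packets=10_000, collision_prob=0.3, seed=1):
    rng = random.Random(seed)
    cw, total_backoff = CW_MIN, 0
    for _ in range(n_packets):
        total_backoff += rng.randint(0, cw - 1)   # backoff counter drawn from [0, CW-1]
        collided = rng.random() < collision_prob
        cw = update(cw, collided)
    return total_backoff / n_packets

for name, rule in (("BEB", beb_update), ("MIDB", midb_update)):
    print(f"{name:4s} mean backoff slots per packet = {simulate(rule):7.1f}")
```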

Keywords: backoff, contention window, CWMIDB, MANET

Procedia PDF Downloads 261
1426 Microseismicity of the Tehran Region Based on Three Seismic Networks

Authors: Jamileh Vasheghani Farahani

Abstract:

The main purpose of this research is to identify the currently active faults and the active tectonics of the area using three seismic networks in the Tehran region: (1) the Tehran Disaster Mitigation and Management Organization (TDMMO), (2) the Broadband Iranian National Seismic Network Center (BIN), and (3) the Iranian Seismological Center (IRSC). In this study, we analyzed microearthquakes that occurred in Tehran city and its surroundings, recorded by the Tehran networks from 1996 to 2015, and found several active faults and trends in the region. There is a 200-year history of historical earthquakes in Tehran. Historical and instrumental seismicity show that the east of Tehran is more active than the west. The Mosha fault in the north of Tehran is one of the active faults of the central Alborz. Other major faults in the region are the Kahrizak, Eyvanakey, Parchin and North Tehran faults. An important seismicity region is the intersection of the Mosha and North Tehran fault systems (Kalan village in Lavasan), which shows a cluster of microearthquakes. According to the historical and microseismic events analyzed in this research, there is a seismic gap in the southeast of Tehran. An empirical relationship is used to assess Mmax based on rupture length. There is a probability of occurrence of a strong earthquake of magnitude 7.0 to 7.5 in the region, based on the assessed capability of the major faults, such as the Parchin and Eyvanekey faults, and on the historical earthquakes.
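The Mmax assessment from rupture length can be illustrated with a standard magnitude-length regression. The abstract does not name the specific relation used, so the sketch below assumes the widely used Wells and Coppersmith (1994) all-slip-type surface-rupture-length relation, and the fault lengths are illustrative values, not measurements from the study.

```python
import math

def mmax_wells_coppersmith(surface_rupture_length_km):
    """Moment magnitude from surface rupture length, Wells & Coppersmith (1994),
    all-slip-type relation: Mw = 5.08 + 1.16 * log10(SRL)."""
    return 5.08 + 1.16 * math.log10(surface_rupture_length_km)

# Hypothetical rupture lengths (km) for major faults around Tehran; illustrative only.
for fault, length_km in [("Mosha", 70), ("North Tehran", 60),
                         ("Parchin", 50), ("Eyvanekey", 50)]:
    print(f"{fault:12s} L = {length_km:3d} km  ->  Mmax ~ {mmax_wells_coppersmith(length_km):.1f}")
```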

Keywords: Iran, major faults, microseismicity, Tehran

Procedia PDF Downloads 356
1425 High Thrust Upper Stage Solar Hydrogen Rocket Design

Authors: Maged Assem Soliman Mossallam

Abstract:

The conversion of a solar thruster model to an upper-stage hydrogen rocket is considered. Solar thrusters are usually categorized as low-to-moderate-thrust systems with high specific impulse. The current study proposes a different concept for such systems by increasing the thrust, which enables their use as upper-stage rockets and for future launch purposes. A computational model for the thruster subsystems is discussed. The first module depends on a ray tracing technique to determine the solar power intercepted by the hydrogen combustion chamber. The cavity receiver is modeled using a finite volume technique. The final module imports the heated hydrogen properties to the nozzle using a quasi-one-dimensional simulation. The probability of shock wave formation inside the nozzle is almost eliminated, as the outlet pressure in the space environment tends to zero. The computational model relates the high-thrust hydrogen rocket conversion to the design parameters and operating conditions of the thruster. Three different designs for solar thruster systems are discussed. The first design is a low-thrust, high-specific-impulse design that produces about 10 Newtons of thrust, the second design's output thrust is about 250 Newtons, and the third design produces about 1000 Newtons.
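The quasi-one-dimensional nozzle step reduces to the standard isentropic exit-velocity and thrust relations. The sketch below evaluates them for heated hydrogen at an assumed chamber temperature and pressure ratio (the study's actual receiver conditions are not given in the abstract) and sizes mass flows roughly for the three quoted thrust levels.

```python
import math

def exit_velocity(gamma, R, T_chamber, p_exit_over_p_chamber):
    """Quasi-one-dimensional isentropic nozzle exit velocity [m/s]."""
    return math.sqrt(2.0 * gamma / (gamma - 1.0) * R * T_chamber
                     * (1.0 - p_exit_over_p_chamber ** ((gamma - 1.0) / gamma)))

def thrust(mdot, v_exit, p_exit, p_ambient, A_exit):
    """Rocket thrust; in the space environment p_ambient -> 0, so over-expansion and
    shock formation inside the nozzle are effectively avoided."""
    return mdot * v_exit + (p_exit - p_ambient) * A_exit

g0 = 9.80665
gamma, R_h2 = 1.40, 4124.0          # hydrogen gas constant [J/(kg K)]; gamma assumed
T_c = 2500.0                        # hypothetical receiver outlet temperature [K]
v_e = exit_velocity(gamma, R_h2, T_c, p_exit_over_p_chamber=1e-3)

# Hypothetical mass flows sized to reach roughly the three quoted thrust levels.
for target in (10.0, 250.0, 1000.0):
    mdot = target / v_e             # neglecting the small pressure-area term
    F = thrust(mdot, v_e, p_exit=100.0, p_ambient=0.0, A_exit=1e-3)
    print(f"target {target:6.1f} N: mdot = {mdot*1e3:7.2f} g/s, "
          f"F = {F:7.1f} N, Isp = {F/(mdot*g0):5.0f} s")
```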

Keywords: space propulsion, hydrogen rocket, thrust, specific impulse

Procedia PDF Downloads 152
1424 Neural Networks and Genetic Algorithms Approach for Word Correction and Prediction

Authors: Rodrigo S. Fonseca, Antônio C. P. Veiga

Abstract:

Aiming to help people with movement limitations that make typing and communication difficult, there is a need for an assistive tool with a learning environment that supports the user by optimizing text input and by identifying errors and providing corrections and choices in the Portuguese language. This work presents an orthographic and grammatical system that can be incorporated into writing environments, improving and facilitating the use of an alphanumeric keyboard. Correction is handled by a prototype built around a genetic algorithm, while prediction is based on the quantity and position of the inserted letters and even their placement in the sentence, ensuring the sequence of ideas using a Long Short-Term Memory (LSTM) neural network. The prototype optimizes data entry, acting as an assistive technology component for textual formulation, detecting errors, seeking solutions, and informing the user of accurate predictions quickly and effectively through machine learning.
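The correction half of the system can be sketched as a small genetic algorithm that evolves candidate strings toward a dictionary, with character matches as the fitness signal. The dictionary, mutation rate, and fitness function below are illustrative assumptions, and the LSTM-based prediction described above is not shown.

```python
import random
import string

DICTIONARY = ["casa", "caso", "carro", "carta", "cesta"]  # tiny illustrative word list
ALPHABET = string.ascii_lowercase

def fitness(candidate, words):
    """Higher is better: closeness of the candidate to its nearest dictionary word."""
    def score(w):
        matches = sum(a == b for a, b in zip(candidate, w))
        return matches - abs(len(candidate) - len(w))
    return max(score(w) for w in words)

def mutate(word, rate=0.2):
    return "".join(random.choice(ALPHABET) if random.random() < rate else c for c in word)

def crossover(a, b):
    cut = random.randint(1, min(len(a), len(b)) - 1)
    return a[:cut] + b[cut:]

def correct(typed, generations=200, pop_size=40):
    """Evolve corrections for a mistyped word toward the dictionary."""
    population = [mutate(typed, rate=0.3) for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=lambda w: fitness(w, DICTIONARY), reverse=True)
        parents = population[: pop_size // 2]
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        population = parents + children
    return max(population, key=lambda w: fitness(w, DICTIONARY))

random.seed(0)
print(correct("czsa"))   # a mistyped 'casa'
```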

Keywords: genetic algorithm, neural networks, word prediction, machine learning

Procedia PDF Downloads 178
1423 Application of Artificial Neural Network for Prediction of High Tensile Steel Strands in Post-Tensioned Slabs

Authors: Gaurav Sancheti

Abstract:

This study presents an approach based on Artificial Neural Networks (ANNs) for determining the quantity of High Tensile Steel (HTS) strands required in post-tensioned (PT) slabs. Various PT slab configurations were generated by varying the span and depth of the slab, and for each of these configurations the quantity of required HTS strands was recorded. ANNs with the backpropagation algorithm and varying architectures were developed, and their performance was evaluated in terms of Mean Square Error (MSE). The recorded data for the quantity of HTS strands were used as a training database for the developed ANNs. The networks were validated using various validation techniques. The results show that the proposed ANNs have great potential, with good prediction and generalization capability.
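A minimal sketch of this workflow: span and depth act as inputs, strand quantity as the target, and a small backpropagation-trained network is scored by MSE on held-out configurations. The mock relationship between slab geometry and strand quantity, and the network architecture, are assumptions, since the study's database is not reproduced here.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)

# Hypothetical slab configurations: span (m) and depth (mm); strand quantities are
# mocked with a simple rule plus noise in place of the study's recorded database.
n = 300
span  = rng.uniform(6.0, 14.0, n)
depth = rng.uniform(180.0, 320.0, n)
strands = 2.5 * span - 0.02 * depth + rng.normal(0.0, 0.5, n)

X = np.column_stack([span, depth])
X_tr, X_te, y_tr, y_te = train_test_split(X, strands, test_size=0.2, random_state=0)

# A small backpropagation-trained network; one of several architectures to compare by MSE.
ann = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(8, 8), max_iter=5000, random_state=0))
ann.fit(X_tr, y_tr)
print(f"validation MSE = {mean_squared_error(y_te, ann.predict(X_te)):.3f}")
```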

Keywords: artificial neural networks, back propagation, conceptual design, high tensile steel strands, post tensioned slabs, validation techniques

Procedia PDF Downloads 206
1422 Predicting Global Solar Radiation Using Recurrent Neural Networks and Climatological Parameters

Authors: Rami El-Hajj Mohamad, Mahmoud Skafi, Ali Massoud Haidar

Abstract:

Several meteorological parameters were used for the prediction of monthly average daily global solar radiation on a horizontal surface using recurrent neural networks (RNNs). Climatological data and measurements, mainly air temperature, humidity, sunshine duration, and wind speed between 1995 and 2007, were used to design and validate feed-forward and recurrent neural network based prediction systems. In this paper we present our reference system based on a feed-forward multilayer perceptron (MLP) as well as the proposed approach based on an RNN model. The obtained results were promising and comparable to those obtained by other existing empirical and neural models. The experimental results showed the advantage of RNNs over simple MLPs when dealing with time series solar radiation predictions based on daily climatological data.

Keywords: recurrent neural networks, global solar radiation, multi-layer perceptron, gradient, root mean square error

Procedia PDF Downloads 428
1421 Improved Processing Speed for Text Watermarking Algorithm in Color Images

Authors: Hamza A. Al-Sewadi, Akram N. A. Aldakari

Abstract:

Copyright protection and ownership proof of digital multimedia are achieved nowadays by digital watermarking techniques. A text watermarking algorithm for protecting the property rights and ownership judgment of color images is proposed in this paper. Embedding is achieved by inserting text elements randomly into the color image as noise. The YIQ image processing model is found to be faster than other image processing methods and hence is adopted for the embedding process. An optional choice of encrypting the text watermark before embedding is also suggested (in case required by some applications), where the text can be encrypted using any enciphering technique, adding more difficulty for attackers. Experiments resulted in an embedding speed improvement of more than double the speed of the other considered systems (such as the least significant bit method and separate color code methods), and a fairly acceptable level of peak signal to noise ratio (PSNR) with low mean square error values for watermarking purposes.
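The two quantities reported above, the YIQ representation and the PSNR/MSE quality measure, can be sketched directly. The conversion matrix is the standard NTSC one; the toy "embedding" below just perturbs a few random pixel values by one level to stand in for the randomly inserted text elements, so the numbers are purely illustrative.

```python
import numpy as np

RGB_TO_YIQ = np.array([[0.299,  0.587,  0.114],
                       [0.596, -0.274, -0.322],
                       [0.211, -0.523,  0.312]])

def rgb_to_yiq(img):
    """Convert an RGB image (H x W x 3, values in [0, 255]) to the YIQ model."""
    return img.astype(float) @ RGB_TO_YIQ.T

def psnr(original, watermarked, peak=255.0):
    """Peak signal-to-noise ratio in dB, computed from the mean squared error."""
    mse = np.mean((original.astype(float) - watermarked.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

rng = np.random.default_rng(0)
cover = rng.integers(0, 256, (64, 64, 3), dtype=np.uint8)

# Toy embedding: perturb the first channel at a few random positions by +/- 1 level,
# standing in for the randomly inserted, noise-like text elements.
stego = cover.copy()
idx = rng.integers(0, 64, (200, 2))
stego[idx[:, 0], idx[:, 1], 0] = np.clip(
    stego[idx[:, 0], idx[:, 1], 0].astype(int) + rng.choice([-1, 1], 200), 0, 255
).astype(np.uint8)

print(f"PSNR = {psnr(cover, stego):.2f} dB")
print("YIQ of a pure-red pixel:", np.round(rgb_to_yiq(np.array([[[255, 0, 0]]]))[0, 0], 1))
```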

Keywords: steganography, watermarking, time complexity measurements, private keys

Procedia PDF Downloads 131
1420 Seismic Hazard Analysis for a Multi Layer Fault System: Antalya (SW Turkey) Example

Authors: Nihat Dipova, Bulent Cangir

Abstract:

This article presents the results of a probabilistic seismic hazard analysis (PSHA) for Antalya (SW Turkey). The southwest of Turkey is characterized by large earthquakes resulting from the continental collision between the African, Arabian and Eurasian plates and from crustal faults. Earthquakes around the study area are grouped into two categories: crustal earthquakes (depth 0-50 km) and subduction zone earthquakes (50-140 km). The maximum observed magnitude of subduction earthquakes is Mw=6.0, and the maximum magnitude of crustal earthquakes is Mw=6.6. The sources of crustal earthquakes are faults related to the Isparta Angle and Cyprus Arc tectonic structures. A new earthquake catalogue for Antalya, with a unified moment magnitude scale, has been prepared, and the seismicity of the area around Antalya city has been evaluated by defining the 'a' and 'b' parameters of the Gutenberg-Richter recurrence relationship. The standard Cornell-McGuire method has been used for the hazard computation, utilizing the CRISIS2007 software. The attenuation relationships proposed by Chiou and Youngs (2008) have been used for 0-50 km earthquakes and that of Youngs et al. (1997) for deep subduction earthquakes. Finally, a seismic hazard map for peak horizontal acceleration on a uniform site condition of firm rock (average shear wave velocity of about 1130 m/s) at a hazard level of 10% probability of exceedance in 50 years has been prepared.
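The Gutenberg-Richter step amounts to fitting log10 N = a - b·M to cumulative counts from the catalogue. The sketch below does this by least squares on a synthetic magnitude set (generated with b near 1); the completeness magnitude, bin width, and catalogue are assumptions, not the Antalya data.

```python
import numpy as np

def gutenberg_richter_fit(magnitudes, m_min=3.0, bin_width=0.1):
    """Least-squares fit of log10 N = a - b*M using cumulative counts above m_min."""
    mags = np.asarray(magnitudes, float)
    m_bins = np.arange(m_min, mags.max(), bin_width)
    cum_counts = np.array([(mags >= m).sum() for m in m_bins])
    mask = cum_counts > 0
    slope, intercept = np.polyfit(m_bins[mask], np.log10(cum_counts[mask]), 1)
    return intercept, -slope   # a, b

# Hypothetical catalogue of moment magnitudes with a true b-value near 1.
rng = np.random.default_rng(0)
mags = 3.0 + rng.exponential(scale=1.0 / np.log(10), size=2000)
a, b = gutenberg_richter_fit(mags)
print(f"a = {a:.2f}, b = {b:.2f}")
```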

Keywords: Antalya, peak ground acceleration, seismic hazard assessment, subduction

Procedia PDF Downloads 363
1419 High-Tech Based Simulation and Analysis of Maximum Power Point in Energy System: A Case Study Using IT Based Software Involving Regression Analysis

Authors: Enemeri George Uweiyohowo

Abstract:

Improving the output control of photovoltaic (PV) systems is one of the major focuses of PV research in recent times, motivated by their low carbon emissions and efficiency. Power failures and outages from commercial providers generally do not promote development in the public and private sectors and basically limit the development of industries. The need for a well-structured PV system is important for an efficient and cost-effective monitoring system. The purpose of this paper is to validate the maximum power point of an off-grid PV system, taking into consideration the most effective tilt and orientation angles for PV panels in the southern hemisphere. This paper is based on analyzing the system using a solar charger with MPPT from a pulse width modulation (PWM) perspective. The power conditioning device chosen is a solar charger with MPPT. The practical setup consists of a PV panel set to an orientation angle of 0°N, with corresponding tilt angles of 36°, 26° and 16°. Preliminary results include a regression analysis (normal probability plot) showing the maximum power point in the system as well as the best tilt angle for maximum power point tracking.

Keywords: poly-crystalline PV panels, information technology (IT), maximum power point tracking (MPPT), pulse width modulation (PWM)

Procedia PDF Downloads 194
1418 Poster: Incident Signals Estimation Based on a Modified MCA Learning Algorithm

Authors: Rashid Ahmed, John N. Avaritsiotis

Abstract:

Many signal subspace-based approaches have already been proposed for determining the fixed Direction of Arrival (DOA) of plane waves impinging on an array of sensors. Two procedures for DOA estimation based on neural networks are presented. First, Principal Component Analysis (PCA) is employed to extract the maximum eigenvalue and its eigenvector from the signal subspace to estimate the DOA. Second, Minor Component Analysis (MCA) is a statistical method for extracting the eigenvector associated with the smallest eigenvalue of the covariance matrix. In this paper, we modify a Minor Component Analysis (MCA(R)) learning algorithm to enhance convergence, since convergence is essential for the MCA algorithm in practical applications. The learning rate parameter is also discussed; it ensures fast convergence of the algorithm because it has a direct effect on the convergence of the weight vector, and the error level is affected by this value. MCA is performed to determine the estimated DOA. Preliminary results are furnished to illustrate the convergence results achieved.
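A generic minor-component extraction can be sketched as gradient descent on the Rayleigh quotient with renormalization of the weight vector at each step; the learning rate plays exactly the role described above. This is an illustrative rule on synthetic array snapshots, not the paper's modified MCA(R) algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic sensor snapshots: one strong signal direction plus noise with
# distinct per-sensor levels (so the minor eigenvector is well defined).
n_sensors, n_snapshots = 4, 5000
signal_dir = np.array([1.0, 0.5, -0.3, 0.2])
signal_dir /= np.linalg.norm(signal_dir)
noise_std = np.array([1.0, 0.9, 0.7, 0.5])
X = (rng.normal(0.0, 3.0, (n_snapshots, 1)) * signal_dir
     + rng.normal(0.0, 1.0, (n_snapshots, n_sensors)) * noise_std)
C = X.T @ X / n_snapshots                      # sample covariance matrix

# Minor-component learning: descend the Rayleigh quotient, renormalize each step.
w = rng.normal(size=n_sensors)
w /= np.linalg.norm(w)
learning_rate = 0.05                           # directly controls convergence and error level
for _ in range(2000):
    w -= learning_rate * (C @ w - (w @ C @ w) * w)
    w /= np.linalg.norm(w)

eigvals, eigvecs = np.linalg.eigh(C)           # reference: exact minor eigenvector
alignment = abs(w @ eigvecs[:, 0])
print(f"smallest eigenvalue = {eigvals[0]:.3f}, |<w, minor eigenvector>| = {alignment:.4f}")
```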

Keywords: Direction of Arrival, neural networks, Principal Component Analysis, Minor Component Analysis

Procedia PDF Downloads 435
1417 Sensor Fault-Tolerant Model Predictive Control for Linear Parameter Varying Systems

Authors: Yushuai Wang, Feng Xu, Junbo Tan, Xueqian Wang, Bin Liang

Abstract:

In this paper, a sensor fault-tolerant control (FTC) scheme using robust model predictive control (RMPC) and set-theoretic fault detection and isolation (FDI) is extended to linear parameter varying (LPV) systems. First, a group of set-valued observers is designed for passive fault detection (FD), and the observer gains are obtained by minimizing the size of the invariant set of the state estimation-error dynamics. Second, an input set for fault isolation (FI) is designed offline through set theory for actively isolating faults after FD. Third, an RMPC controller based on state estimation for LPV systems is designed to control the system in the presence of disturbance and measurement noise and to tolerate faults. In addition, an FTC algorithm is proposed to keep the plant operating in the corresponding mode when a fault occurs. Finally, a numerical example is used to show the effectiveness of the proposed results.

Keywords: fault detection, linear parameter varying, model predictive control, set theory

Procedia PDF Downloads 231