Search results for: generalized mathematical model.
4866 Implementation of Quantum Rotation Gates Using Controlled Non-Adiabatic Evolutions
Authors: Abdelrahman A. H. Abdelrahim, Gharib Subhi Mahmoud, Sherzod Turaev, Azeddine Messikh
Abstract:
Quantum gates are the basic building blocks of the quantum circuit model. These gates can be implemented using adiabatic or non-adiabatic processes. Adiabatic models can be controlled using auxiliary qubits, whereas non-adiabatic models can be simplified by using a single-shot implementation. In this paper, controlled adiabatic evolutions are combined with the single-shot implementation to obtain quantum gates with controlled non-adiabatic evolutions. This is an important improvement, which can speed up the implementation of quantum gates and reduce the errors caused by the long run times of the adiabatic model. The robustness of our scheme to different types of errors is also investigated.
Keywords: Adiabatic evolutions, non-adiabatic evolutions, controlled adiabatic evolutions, quantum rotation gates, dephasing rates, master equation.
4865 The Importance of Class Attendance and Cumulative GPA for Academic Success in Industrial Engineering Classes
Authors: Suleiman Obeidat, Adnan Bashir, Wisam Abu Jadayil
Abstract:
The effect of attendance percentage, overall GPA, and the number of credit hours a student is enrolled in during a specific semester on the grade attained in a specific course has been studied. This study was performed on three courses offered in the Industrial Engineering Department at the Hashemite University in Jordan. The study revealed that the grade attained by a student is strongly affected by the attendance percentage and the overall GPA, with an R² value of 52.5%. Another model investigated is the relation between the semester GPA and the attendance percentage, the number of credit hours enrolled in during that semester, and the overall GPA. This model gave a strong relationship between the semester GPA, the attendance percentage, and the overall GPA, with an R² value of 76.2%.
Keywords: Attendance in classes, GPA, Industrial Engineering, Grade
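As a toy illustration of the kind of regression model reported above (a sketch with made-up numbers, not the study's data), the following fits course grade on attendance percentage and overall GPA by least squares and computes R² from the residuals:

```python
import numpy as np

# Hypothetical data: attendance (%), overall GPA, and course grade for eight students.
attendance = np.array([95, 80, 60, 88, 70, 100, 55, 92], dtype=float)
overall_gpa = np.array([3.4, 2.9, 2.1, 3.1, 2.5, 3.8, 2.0, 3.3])
grade = np.array([88, 75, 58, 82, 66, 94, 52, 85], dtype=float)

# Design matrix with an intercept column: grade ~ b0 + b1*attendance + b2*GPA.
X = np.column_stack([np.ones_like(attendance), attendance, overall_gpa])
beta, *_ = np.linalg.lstsq(X, grade, rcond=None)

# Coefficient of determination R^2 of the fitted model.
residuals = grade - X @ beta
r2 = 1.0 - residuals.var() / grade.var()
print("coefficients:", np.round(beta, 3), "R^2:", round(r2, 3))
```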
4864 Social Business Models: When Profits and Impacts Are Not at Odds
Authors: Elisa Pautasso, Matteo Castagno, Michele Osella
Abstract:
In the last decade, the emergence of new social needs as an effect of the economic crisis has stimulated the flourishing of business endeavours characterised by explicit social goals. Social start-ups, social enterprises or Corporate Social Responsibility operations carried out by traditional companies are quintessential examples in this regard. This paper analyses these kinds of initiatives in order to discover the main characteristics of social business models and to provide insights to social entrepreneurs for developing or improving their strategies. The research is conducted through the integration of literature review and case study analysis and, thanks to the recognition of the importance of both profits and social impacts as the key success factors for a social business model, proposes a framework for identifying indicators suitable for measuring the social impacts generated.
Keywords: Business model, case study, impacts, social business.
4863 Zero Dimensional Simulation of Combustion Process of a DI Diesel Engine Fuelled With Biofuels
Authors: Donepudi Jagadish, Ravi Kumar Puli, K. Madhu Murthy
Abstract:
A zero-dimensional model has been used to investigate the combustion performance of a single-cylinder direct injection diesel engine fuelled by biofuels, with options such as supercharging and exhaust gas recirculation. The numerical simulation was performed at constant speed. The indicated pressure and temperature diagrams are plotted and compared for different fuels. The emissions of soot and nitrous oxide are computed with phenomenological models. Experimental work was also carried out with biodiesel (palm stearin methyl ester)-diesel blends and ethanol-diesel blends to validate the simulation results, and it was observed that the present model is successful in predicting the engine performance with biofuels.
Keywords: Biofuels, Zero Dimensional Modeling, Engine Performance, Engine Emissions
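For context, a standard single-zone (zero-dimensional) formulation is sketched below; the paper's specific sub-models for biofuel properties, soot and nitrous oxide are not reproduced, and the Wiebe symbols (a, m, Δθ, Q_in) follow common usage rather than the paper's notation.

```latex
% Single-zone energy balance in the crank-angle domain:
\frac{dT}{d\theta} = \frac{1}{m\,c_v}\left(\frac{dQ_{comb}}{d\theta}
                    - \frac{dQ_{wall}}{d\theta} - p\,\frac{dV}{d\theta}\right)

% Heat release modelled with a Wiebe burn-fraction function:
x_b(\theta) = 1 - \exp\!\left[-a\left(\frac{\theta-\theta_0}{\Delta\theta}\right)^{m+1}\right],
\qquad \frac{dQ_{comb}}{d\theta} = Q_{in}\,\frac{dx_b}{d\theta}
```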
4862 A New Approach for Classifying Large Number of Mixed Variables
Authors: Hashibah Hamid
Abstract:
The issue of classifying objects into one of several predefined groups when the measured variables are a mixture of different types has been of interest among statisticians for many years. Several methods for dealing with such situations have been introduced, including parametric, semi-parametric and nonparametric approaches. This paper discusses the problem of classifying data when the number of measured mixed variables is larger than the sample size. A proposed approach that integrates a dimensionality reduction technique via principal component analysis with a discriminant function based on the location model is discussed. The study aims to offer practitioners another potential tool for classification problems in which the observed variables are mixed and very numerous.
Keywords: classification, location model, mixed variables, principal component analysis.
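A minimal sketch of the two-stage idea (dimension reduction followed by a discriminant rule), assuming hypothetical data in which the categorical variables have already been dummy-coded; scikit-learn's LDA is used here as a simple stand-in for the location-model discriminant described in the abstract.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

# Hypothetical wide data set: 30 observations, 100 mixed variables
# (categorical variables assumed already dummy-coded into 0/1 columns).
X = rng.normal(size=(30, 100))
X[:, :20] = (X[:, :20] > 0).astype(float)   # binary part of the data
y = rng.integers(0, 2, size=30)             # two predefined groups

# Reduce the dimension first (p >> n), then apply a discriminant rule.
clf = make_pipeline(PCA(n_components=10), LinearDiscriminantAnalysis())
clf.fit(X, y)
print("training accuracy:", clf.score(X, y))
```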
4861 Bioprocess Optimization Based On Relevance Vector Regression Models and Evolutionary Programming Technique
Authors: R. Simutis, V. Galvanauskas, D. Levisauskas, J. Repsyte
Abstract:
This paper proposes a bioprocess optimization procedure based on Relevance Vector Regression models and an evolutionary programming technique. The Relevance Vector Regression scheme allows a compact and stable data-based process model to be developed, avoiding time-consuming modeling expenses. The model building and process optimization procedure can be carried out in a semi-automated way and repeated after every new cultivation run. The proposed technique was tested on a simulated mammalian cell cultivation process. The obtained results are promising and could be attractive for the optimization of industrial bioprocesses.
Keywords: Bioprocess optimization, Evolutionary programming, Relevance Vector Regression.
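A schematic of the optimization loop only, under stated assumptions: a Gaussian process regressor stands in for the Relevance Vector Regression model (RVR is not part of scikit-learn), and the cultivation data, objective and variable bounds are hypothetical.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(1)

# Hypothetical cultivation records: two control variables -> product titre.
X_hist = rng.uniform(0, 1, size=(40, 2))
y_hist = np.sin(3 * X_hist[:, 0]) + X_hist[:, 1] ** 2 + 0.05 * rng.normal(size=40)

# Data-based surrogate model (stand-in for the RVR model of the paper).
surrogate = GaussianProcessRegressor().fit(X_hist, y_hist)

# Simple evolutionary programming loop: Gaussian mutation + truncation selection.
pop = rng.uniform(0, 1, size=(20, 2))
for generation in range(50):
    offspring = np.clip(pop + 0.1 * rng.normal(size=pop.shape), 0, 1)
    both = np.vstack([pop, offspring])
    fitness = surrogate.predict(both)
    pop = both[np.argsort(fitness)[-20:]]   # keep the 20 best candidates

best = pop[-1]
print("suggested operating point:", best, "predicted titre:", surrogate.predict([best])[0])
```

As the abstract describes, such a surrogate would be refitted after every new cultivation run before the optimization is repeated.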
4860 DWM-CDD: Dynamic Weighted Majority Concept Drift Detection for Spam Mail Filtering
Authors: Leili Nosrati, Alireza Nemaney Pour
Abstract:
Although e-mail is the most efficient and popular communication method, unwanted mass unsolicited e-mails, also called spam mail, endanger the existence of the mail system. This paper proposes a new algorithm called Dynamic Weighted Majority Concept Drift Detection (DWM-CDD) for content-based filtering. The design purposes of DWM-CDD are, first, to improve the accuracy of previously proposed algorithms and, second, to speed up model construction. The results show that DWM-CDD can detect both sudden and gradual changes quickly and accurately. Moreover, the time needed for model construction is less than that of previously proposed algorithms.
Keywords: Concept drift, Content-based filtering, E-mail, Spam mail.
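A compact sketch of the generic Dynamic Weighted Majority ensemble (Kolter and Maloof) that DWM-CDD builds on; the concept-drift-detection refinements specific to DWM-CDD and the e-mail feature extraction are not reproduced, and the base learner is a simple stand-in. Here x is assumed to be a binary bag-of-words vector for one e-mail and y its spam (1) or ham (0) label.

```python
import numpy as np
from sklearn.naive_bayes import BernoulliNB

class DynamicWeightedMajority:
    """Generic DWM ensemble for a binary stream of labelled e-mails."""

    def __init__(self, beta=0.5, theta=0.01, period=50):
        self.beta, self.theta, self.period = beta, theta, period
        self.experts, self.weights = [], []

    def _add_expert(self, x, y):
        clf = BernoulliNB(class_prior=[0.5, 0.5])
        clf.partial_fit(x, y, classes=[0, 1])
        self.experts.append(clf)
        self.weights.append(1.0)

    def update(self, x, y, step):
        """Process one labelled example; return the ensemble prediction for it."""
        x, y = np.atleast_2d(x), np.atleast_1d(y)
        if not self.experts:
            self._add_expert(x, y)
        votes = np.zeros(2)
        for i, clf in enumerate(self.experts):
            pred = int(clf.predict(x)[0])
            if pred != y[0] and step % self.period == 0:
                self.weights[i] *= self.beta            # penalize experts that err
            votes[pred] += self.weights[i]
        prediction = int(np.argmax(votes))
        if step % self.period == 0:
            w = np.array(self.weights) / max(self.weights)   # normalize weights
            keep = w >= self.theta
            self.experts = [c for c, k in zip(self.experts, keep) if k]
            self.weights = list(w[keep])
            if prediction != y[0]:
                self._add_expert(x, y)                  # add a fresh expert on a global mistake
        for clf in self.experts:
            clf.partial_fit(x, y)                       # all experts learn incrementally
        return prediction
```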
4859 Memory and Higher Cognition
Authors: A. Páchová
Abstract:
Working memory (WM) can be defined as the system which actively holds information in the mind in order to perform tasks in spite of distraction. In contrast, short-term memory (STM) is a system that represents the capacity for actively storing information without distraction. There is accumulating evidence that these types of memory are related to higher cognition (HC). The aim of this study was to verify the relationship between HC and memory (visual STM and WM, auditory STM and WM). Fifty-nine primary school children were tested with an intelligence test, mathematical tasks (HC), and memory subtests. We have shown that visual, but not auditory, memory is a significant predictor of higher cognition. The relevance of these results is discussed.
Keywords: higher cognition, long-term memory, short-term memory, working memory
4858 Research of Dynamics Picking Mechanism of Sulzer Projectile Loom
Authors: A. Jomartov, K. Jomartova
Abstract:
One of the main and most critical units of the Sulzer projectile loom is the picking mechanism. It is specifically designed to accelerate the projectile to a speed of 25 m/s. The initial speed of the projectile in the Sulzer projectile loom is independent of the loom speed and is determined by the potential energy of the torsion rod. This paper investigates the dynamics of the picking mechanism of the Sulzer projectile loom during its discharge. As a result of the calculation model, we obtain the law of motion of the picking mechanism lever during its discharge. Constructing a dynamic model of the picking mechanism of the Sulzer projectile loom in the SimulationX software package makes it possible to perform calculations for different torsion rod thicknesses, taking into account the backlashes in the connections, the dissipative forces, and the resistance forces.
Keywords: Dynamics, loom, picking mechanism, projectile, SimulationX.
4857 Multi-Layer Multi-Feature Background Subtraction Using Codebook Model Framework
Authors: Yun-Tao Zhang, Jong-Yeop Bae, Whoi-Yul Kim
Abstract:
Background modeling and subtraction in video analysis has been widely used as an effective method for moving object detection in many computer vision applications. Recently, a large number of approaches have been developed to tackle different types of challenges in this field. However, dynamic backgrounds and illumination variations are the most frequently occurring problems in practical situations. This paper presents a two-layer model based on the codebook algorithm incorporating a local binary pattern (LBP) texture measure, targeted at handling dynamic background and illumination variation problems. More specifically, the first layer is a block-based codebook combining an LBP histogram with the mean value of each RGB color channel. Because of the invariance of LBP features with respect to monotonic gray-scale changes, this layer can produce block-wise detection results with considerable tolerance of illumination variations. A pixel-based codebook is employed to refine the output of the first layer and further eliminate false positives. As a result, the proposed approach can greatly improve accuracy under dynamic background and illumination changes. Experimental results on several popular background subtraction datasets demonstrate very competitive performance compared to previous models.
Keywords: Background subtraction, codebook model, local binary pattern, dynamic background, illumination changes.
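A small sketch of the LBP texture feature used by the first (block-based) layer, assuming a grayscale frame stored as a NumPy array; the codebook construction and matching logic of the paper are not reproduced here.

```python
import numpy as np

def lbp_image(gray):
    """8-neighbour local binary pattern codes for the interior pixels of a grayscale image."""
    c = gray[1:-1, 1:-1]
    # Offsets of the 8 neighbours, ordered clockwise from the top-left.
    shifts = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]
    code = np.zeros_like(c, dtype=np.uint8)
    for bit, (dy, dx) in enumerate(shifts):
        neighbour = gray[1 + dy:gray.shape[0] - 1 + dy, 1 + dx:gray.shape[1] - 1 + dx]
        mask = (neighbour >= c).astype(np.uint8)   # 1 where the neighbour is not darker
        code |= mask << bit
    return code

def block_lbp_histograms(gray, block=16):
    """Per-block normalized LBP histograms, the block-level feature used alongside RGB means."""
    codes = lbp_image(gray.astype(np.int16))
    h, w = codes.shape
    feats = {}
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            hist, _ = np.histogram(codes[y:y + block, x:x + block], bins=256, range=(0, 256))
            feats[(y, x)] = hist / hist.sum()
    return feats
```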
4856 Analysis of Surface Hardness, Surface Roughness, and Near Surface Microstructure of AISI 4140 Steel Worked with Turn-Assisted Deep Cold Rolling Process
Authors: P. R. Prabhu, S. M. Kulkarni, S. S. Sharma, K. Jagannath, Achutha Kini U.
Abstract:
In the present study, response surface methodology has been used to optimize the turn-assisted deep cold rolling process for AISI 4140 steel. A regression model is developed to predict surface hardness and surface roughness using response surface methodology and a central composite design. In developing the predictive model, the deep cold rolling force, ball diameter, initial roughness of the workpiece, and number of tool passes are considered as model variables. The rolling force and the ball diameter are the significant factors for surface hardness, while the ball diameter and the number of tool passes are found to be significant for surface roughness. The predicted surface hardness and surface roughness values and the subsequent verification experiments under the optimal operating conditions confirmed the validity of the predicted model. The absolute average error between the experimental and predicted values at the optimal combination of parameter settings is 0.16% for surface hardness and 1.58% for surface roughness. Using the optimal processing parameters, the surface hardness is improved from 225 to 306 HV, an increase in near-surface hardness of about 36%, and the surface roughness is improved from 4.84 µm to 0.252 µm, a decrease of about 95%. The depth of compression is found to be more than 300 µm from the microstructure analysis, which is in correlation with the results obtained from the microhardness measurements. A Taylor Hobson Talysurf tester, a micro Vickers hardness tester, optical microscopy, and an X-ray diffractometer are used to characterize the modified surface layer.
Keywords: Surface hardness, response surface methodology, microstructure, central composite design, deep cold rolling, surface roughness.
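A minimal sketch of fitting a second-order response surface by least squares, with hypothetical force, ball-diameter and pass settings; the paper's actual central composite design, measurements and ANOVA are not reproduced.

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

# Hypothetical deep cold rolling settings: [force (N), ball diameter (mm), number of passes]
X = np.array([[500, 6, 1], [500, 10, 3], [900, 6, 3], [900, 10, 1],
              [700, 8, 2], [700, 8, 2], [300, 8, 2], [1100, 8, 2]])
roughness = np.array([1.9, 1.2, 0.9, 1.5, 1.0, 1.1, 2.6, 0.7])   # Ra in µm (illustrative)

# Second-order (quadratic) response surface:
# y = b0 + sum(b_i x_i) + sum(b_ij x_i x_j) + sum(b_ii x_i^2)
rsm = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
rsm.fit(X, roughness)

# Predict the response at a candidate parameter combination.
print("predicted Ra:", rsm.predict([[800, 10, 3]])[0])
```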
4855 Developing Optical Sensors with Application of Cancer Detection by Elastic Light Scattering Spectroscopy
Authors: May Fadheel Estephan, Richard Perks
Abstract:
Cancer is a serious health concern that affects millions of people worldwide. Early detection and treatment are essential for improving patient outcomes. However, current methods for cancer detection have limitations, such as low sensitivity and specificity. The aim of this study was to develop an optical sensor for cancer detection using elastic light scattering spectroscopy (ELSS). ELSS is a non-invasive optical technique that can be used to characterize the size and concentration of particles in a solution. An optical probe was fabricated with a 100-μm-diameter core and a 132-μm centre-to-centre separation. The probe was used to measure the ELSS spectra of polystyrene spheres with diameters of 2 μm, 0.8 μm, and 0.413 μm. The spectra were then analysed to determine the size and concentration of the spheres. The results showed that the optical probe was able to differentiate between the three different sizes of polystyrene spheres. The probe was also able to detect the presence of polystyrene spheres at suspension concentrations as low as 0.01%. The results of this study demonstrate the potential of ELSS for cancer detection. ELSS is a non-invasive technique that can be used to characterize the size and concentration of cells in a tissue sample. This information can be used to identify cancer cells and assess the stage of the disease. The data for this study were collected by measuring the ELSS spectra of polystyrene spheres with different diameters. The spectra were collected using a spectrometer and a computer. The ELSS spectra were analysed using a software program to determine the size and concentration of the spheres. The software program used a mathematical algorithm to fit the spectra to a theoretical model. The question addressed by this study was whether ELSS could be used to detect cancer cells. The results showed that ELSS could differentiate between cells of different sizes, suggesting that it could be used to detect cancer cells. The findings of this research show the utility of ELSS in the early identification of cancer: it offers a non-invasive means of characterizing the number and size of cells in a tissue sample, from which cancer cells can be identified and the stage of the disease determined. Further research is needed to evaluate the clinical performance of ELSS for cancer detection.
Keywords: Elastic Light Scattering Spectroscopy, Polystyrene spheres in suspension, optical probe, fibre optics.
4854 Simulation Study of Radial Heat and Mass Transfer Inside a Fixed Bed Catalytic Reactor
Authors: K. Vakhshouri, M. M. Y. Motamed Hashemi
Abstract:
A rigorous two-dimensional model is developed for simulating the operation of a less-investigated type of steam reformer that has a considerably lower operating Reynolds number, a larger tube diameter, and no extra steam in the feed compared with conventional steam reformers. Simulation results show that reasonable predictions can only be achieved when certain correlations for the wall-to-fluid heat transfer equations are applied. Due to the severe operating conditions, strong radial temperature gradients inside the reformer tubes are found in all cases. Furthermore, the results show how a given catalyst loading profile affects the operation of the reformer.
Keywords: Steam reforming, direct reduction, heat transfer, two-dimensional model, simulation.
4853 Continuous Feature Adaptation for Non-Native Speech Recognition
Authors: Y. Deng, X. Li, C. Kwan, B. Raj, R. Stern
Abstract:
The current speech interfaces in many military applications may be adequate for native speakers. However, the recognition rate drops considerably for non-native speakers (people with foreign accents). This is mainly because non-native speakers exhibit large temporal and intra-phoneme variations when they pronounce the same words. The problem is further complicated by the presence of strong environmental noise such as tank noise, helicopter noise, etc. In this paper, we propose a novel continuous acoustic feature adaptation algorithm for on-line accent and environmental adaptation. Implemented by incremental singular value decomposition (SVD), the algorithm captures local acoustic variation and runs in real time. This feature-based adaptation method is then integrated with the conventional model-based maximum likelihood linear regression (MLLR) algorithm. Extensive experiments have been performed on the NATO non-native speech corpus with a baseline acoustic model trained on native American English. The proposed feature-based adaptation algorithm improved the average recognition accuracy by 15%, while the MLLR model-based adaptation achieved an 11% improvement. The corresponding word error rate (WER) reductions were 25.8% and 2.73%, respectively, compared to the system without adaptation. The combined adaptation achieved an overall recognition accuracy improvement of 29.5% and a WER reduction of 31.8%, compared to the system without adaptation.
4852 Modelling the Occurrence of Defects and Change Requests during User Acceptance Testing
Authors: Kevin McDaid, Simon P. Wilson
Abstract:
Software developed for a specific customer under contract typically undergoes a period of testing by the customer before acceptance. This is known as user acceptance testing, and the process can reveal both defects in the system and requests for changes to the product. This paper uses nonhomogeneous Poisson processes to model a real user acceptance data set from a recently developed system. In particular, a split Poisson process is shown to provide an excellent fit to the data. The paper explains how this model can be used to aid the allocation of resources through the accurate prediction of occurrences, both during the acceptance testing phase and before this activity begins.
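For illustration, a standard way to simulate a nonhomogeneous Poisson process by thinning is sketched below; the intensity function is a hypothetical decaying defect rate, not the split-Poisson intensity fitted in the paper.

```python
import numpy as np

def simulate_nhpp(intensity, t_end, lam_max, seed=0):
    """Simulate event times of a nonhomogeneous Poisson process on [0, t_end] by thinning."""
    rng = np.random.default_rng(seed)
    times, t = [], 0.0
    while True:
        t += rng.exponential(1.0 / lam_max)          # candidate from a homogeneous process
        if t > t_end:
            return np.array(times)
        if rng.uniform() < intensity(t) / lam_max:   # accept with probability lambda(t)/lam_max
            times.append(t)

# Hypothetical defect intensity that decays as acceptance testing progresses (defects/day).
intensity = lambda t: 5.0 * np.exp(-0.1 * t)
defect_times = simulate_nhpp(intensity, t_end=30.0, lam_max=5.0)
print(len(defect_times), "defects over 30 days:", np.round(defect_times, 1))
```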
4851 Using RASCAL Code to Analyze the Postulated UF6 Fire Accident
Authors: J. R. Wang, Y. Chiang, W. S. Hsu, S. H. Chen, J. H. Yang, S. W. Chen, C. Shih, Y. F. Chang, Y. H. Huang, B. R. Shen
Abstract:
In this research, the RASCAL code was used to simulate and analyze a postulated UF6 fire accident which may occur at the Institute of Nuclear Energy Research (INER). There are four main steps in this research. In the first step, the UF6 data of INER were collected. In the second step, the RASCAL analysis methodology and model were established using these data. Third, this RASCAL model was used to perform the simulation and analysis of the postulated UF6 fire accident; three cases were simulated and analyzed in this step. Finally, the analysis results of RASCAL were compared with the hazardous levels of the chemicals involved. According to the comparison of the three cases, Case 3 poses the greatest danger to human health.
Keywords: RASCAL, UF6, Safety, Hydrogen fluoride.
4850 Service Architecture for 3rd Party Operator's Participation
Authors: F. Sarabchi, A. H. Darvishan, H. Yeganeh, H. Ahmadian
Abstract:
Next generation networks, with the idea of converging the service and control layers of existing networks (fixed, mobile and data) and the intention of providing services in an integrated network, have opened new horizons for telecom operators. On the other hand, economic pressures have caused operators to look for new sources of income, including new services, subscribing more users, promoting greater use of network resources, and making it easy for service providers or 3rd party operators to participate in utilizing networks. With this requirement, an architecture based on next generation objectives for the service layer is necessary. In this paper, a new architecture based on the IMS model describes the participation of 3rd party operators in the creation and implementation of services on an integrated telecom network.
Keywords: Service model, IMS, API, Scripting language, JAIN, Parlay.
4849 The Impact of Knowledge Sharing on Innovation Capability in United Arab Emirates Organizations
Authors: S. Abdallah, A. Khalil, A. Divine
Abstract:
The purpose of this study was to explore the relationship between knowledge sharing and innovation capability by examining the influence of individual, organizational, and technological factors on knowledge sharing. The research is based on a survey of 103 employees from different organizations in the United Arab Emirates. The study uses a model and a questionnaire previously tested by Lin [1] and thus aims to examine the validity of that model in the UAE context. The results show varying degrees of correlation between the different variables, with ICT use having the strongest relationship with the innovation capabilities of organizations. The study also revealed little evidence of knowledge collecting and knowledge sharing among UAE employees.
Keywords: Knowledge sharing, Organization Innovation, Technology Use, Innovation Capabilities.
4848 Two New Relative Efficiencies of Linear Weighted Regression
Authors: Shuimiao Wan, Chao Yuan, Baoguang Tian
Abstract:
In the theory of parameter estimation in statistics, two kinds of estimators are commonly used: the least-squares estimator (LSE) and the best linear unbiased estimator (BLUE). By the theorem determining the minimum variance unbiased estimator (MVUE), the BLUE is the ideal parameter estimator in the linear model. However, because the calculations are complicated or the covariance matrix is not given, the BLUE is often hard to obtain. Therefore, the LSE is frequently used instead of the BLUE, and this substitution incurs some loss. To quantify this loss, many scholars have presented different relative efficiencies from different viewpoints. For the linear weighted regression model, this paper discusses the relative efficiencies of the LSE of β with respect to the BLUE of β. It also defines two new relative efficiencies and gives their lower bounds.
Keywords: Linear weighted regression, Relative efficiency, Lower bound, Parameter estimation.
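For context, for the weighted linear model y = Xβ + ε with Cov(ε) = σ²Σ, the two estimators being compared and one classical relative efficiency (the ratio of generalized variances) can be written as follows; the paper's two new efficiencies and their lower bounds are not reproduced here.

```latex
\hat{\beta}_{\mathrm{LSE}}  = (X^{\top}X)^{-1}X^{\top}y, \qquad
\hat{\beta}_{\mathrm{BLUE}} = (X^{\top}\Sigma^{-1}X)^{-1}X^{\top}\Sigma^{-1}y,

\operatorname{Cov}(\hat{\beta}_{\mathrm{LSE}})  = \sigma^{2}(X^{\top}X)^{-1}X^{\top}\Sigma X\,(X^{\top}X)^{-1}, \qquad
\operatorname{Cov}(\hat{\beta}_{\mathrm{BLUE}}) = \sigma^{2}(X^{\top}\Sigma^{-1}X)^{-1},

e = \frac{\bigl|\operatorname{Cov}(\hat{\beta}_{\mathrm{BLUE}})\bigr|}
         {\bigl|\operatorname{Cov}(\hat{\beta}_{\mathrm{LSE}})\bigr|} \le 1 .
```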
4847 A Fuzzy Nonlinear Regression Model for Interval Type-2 Fuzzy Sets
Authors: O. Poleshchuk, E. Komarov
Abstract:
This paper presents a regression model for interval type-2 fuzzy sets based on the least squares estimation technique. The unknown coefficients are assumed to be triangular fuzzy numbers. The basic idea is to determine aggregation intervals for the type-1 fuzzy sets whose membership functions are the lower and upper membership functions of an interval type-2 fuzzy set. These aggregation intervals are called weighted intervals. The lower and upper membership functions of the input and output interval type-2 fuzzy sets for the developed regression models are considered as piecewise linear functions.
Keywords: Interval type-2 fuzzy sets, fuzzy regression, weighted interval.
4846 Soft Connected Spaces and Soft Paracompact Spaces
Authors: Fucai Lin
Abstract:
Soft topological spaces are considered as mathematical tools for dealing with uncertainties, and a fuzzy topological space is a special case of a soft topological space. The purpose of this paper is to study soft topological spaces. We introduce some new concepts in soft topological spaces, such as soft closed mappings, soft open mappings, soft connected spaces, and soft paracompact spaces. We also redefine the concept of soft points so that it is reasonable in soft topological spaces. Moreover, some basic properties of these concepts are explored.
Keywords: soft sets, soft open mappings, soft closed mappings, soft connected spaces, soft paracompact spaces.
4845 A Description Logics Based Approach for Building Multi-Viewpoints Ontologies
Authors: M. Hemam, M. Djezzar, T. Djouad
Abstract:
We are interested in the problem of building an ontology in a heterogeneous organization by taking into account the different viewpoints and different terminologies of communities in the organization. Such an ontology, which we call a multi-viewpoint ontology, confers on the same universe of discourse several partial descriptions, where each one is relative to a particular viewpoint. In addition, these partial descriptions share, at the global level, ontological elements that constitute a consensus between the various viewpoints. In order to address this problem, we define a multi-viewpoints knowledge model based on the notions of viewpoint and ontology. The multi-viewpoints knowledge model is then used to formalize the multi-viewpoints ontology in a description logics language.
Keywords: Description logic, knowledge engineering, ontology, viewpoint.
4844 Optimization of Structure of Section-Based Automated Lines
Authors: R. Usubamatov, M. Z. Abdulmuin
Abstract:
Automated production lines with so-called 'hard structures' are widely used in manufacturing. Designers segment these lines into sections by placing buffers between series of machine tools to increase productivity. In real production conditions, the capacity of a buffer system is limited, and a real production line can compensate for only part of the productivity losses of an automated line. The productivity of such production lines cannot be readily determined. This paper presents a mathematical approach to determining the structure of section-based automated production lines by the criterion of maximum productivity.
Keywords: optimization, production line, productivity, sections
4843 Effect of Progressive Type-I Right Censoring on Bayesian Statistical Inference of Simple Step–Stress Acceleration Life Testing Plan under Weibull Life Distribution
Authors: Saleem Z. Ramadan
Abstract:
This paper discusses the effects of using progressive Type-I right censoring on the design of simple step-stress accelerated life testing using a Bayesian approach for Weibull life products, under the assumption of the cumulative exposure model. The optimization criterion used in this paper is to minimize the expected pre-posterior variance of the Pth percentile of the time to failure. The model variables are the stress changing time and the stress value for the first step. A comparison between conventional and progressive Type-I right censoring is provided. The results show that progressive Type-I right censoring reduces the cost of testing at the expense of test precision when the sample size is small. Moreover, the results show that using strong priors or a large sample size reduces the sensitivity of the test precision to the censoring proportion. Hence, progressive Type-I right censoring is recommended in these cases, as it reduces the cost of the test without greatly affecting its precision. Moreover, the results show that the choice of direct or indirect priors affects the precision of the test.
Keywords: Reliability, Accelerated life testing, Cumulative exposure model, Bayesian estimation, Progressive Type-I censoring, Weibull distribution.
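For reference, the cumulative exposure model for a simple (two-level) step-stress test under a Weibull distribution with common shape β, stress-dependent scale parameters α₁ and α₂, and stress changed at time τ takes the standard form below; s is the equivalent start time at which the second stress level gives the same cumulative failure fraction as τ at the first level. The prior structure and progressive censoring scheme analysed in the paper are not shown.

```latex
F(t) =
\begin{cases}
1 - \exp\!\left[-\left(\dfrac{t}{\alpha_1}\right)^{\beta}\right], & 0 \le t < \tau, \\[2ex]
1 - \exp\!\left[-\left(\dfrac{t - \tau + s}{\alpha_2}\right)^{\beta}\right], & t \ge \tau,
\end{cases}
\qquad s = \tau\,\frac{\alpha_2}{\alpha_1}.
```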
4842 River Stage-Discharge Forecasting Based on Multiple-Gauge Strategy Using EEMD-DWT-LSSVM Approach
Authors: Farhad Alizadeh, Alireza Faregh Gharamaleki, Mojtaba Jalilzadeh, Houshang Gholami, Ali Akhoundzadeh
Abstract:
This study presents a hybrid pre-processing approach, along with a conceptual model, to enhance the accuracy of river discharge prediction. To achieve this goal, the Ensemble Empirical Mode Decomposition algorithm (EEMD), the Discrete Wavelet Transform (DWT), and Mutual Information (MI) were employed as a hybrid pre-processing approach coupled with a Least Squares Support Vector Machine (LSSVM). A conceptual strategy, namely the multi-station model, was developed to forecast the Souris River discharge more accurately. The strategy used herein was capable of covering the uncertainties and complexities of river discharge modeling. DWT and EEMD were coupled, and feature selection was performed on the decomposed sub-series using MI for use in the multi-station model. In the proposed feature selection method, some useless sub-series were omitted to achieve better performance. The results confirmed the efficiency of the proposed DWT-EEMD-MI approach in improving the accuracy of multi-station modeling strategies.
Keywords: River stage-discharge process, LSSVM, discrete wavelet transform (DWT), ensemble empirical mode decomposition (EEMD), multi-station modeling.
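A schematic of the pre-processing-plus-regression chain, with stated substitutions: PyWavelets supplies the DWT, scikit-learn's mutual_info_regression supplies the MI-based feature selection, and kernel ridge regression stands in for the LSSVM; the EEMD stage and the multi-station setup are omitted, and the discharge series is synthetic.

```python
import numpy as np
import pywt
from sklearn.feature_selection import mutual_info_regression
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(0)
# Synthetic stand-in for an upstream-gauge discharge series.
discharge = np.cumsum(rng.normal(size=512)) + 10 * np.sin(np.arange(512) / 20)

# DWT decomposition, then reconstruct each sub-series (approximation + details) to full length.
coeffs = pywt.wavedec(discharge, "db4", level=3)
subseries = []
for i in range(len(coeffs)):
    kept = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
    subseries.append(pywt.waverec(kept, "db4")[:len(discharge)])

# Lagged sub-series values as candidate predictors of next-step discharge.
X = np.column_stack([s[:-1] for s in subseries])
y = discharge[1:]

# Mutual-information-based selection of the most informative sub-series.
mi = mutual_info_regression(X, y)
selected = np.argsort(mi)[-3:]

# Kernel ridge regression used here as a stand-in for the LSSVM.
model = KernelRidge(kernel="rbf", alpha=1.0, gamma=0.1).fit(X[:-50][:, selected], y[:-50])
print("held-out R^2:", round(model.score(X[-50:][:, selected], y[-50:]), 3))
```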
4841 Corruption, Economic Growth, and Income Inequality: Evidence from Ten Countries in Asia
Authors: Chiung-Ju Huang
Abstract:
This study utilizes the panel vector error correction model (PVECM) to examine the relationship among corruption, economic growth, and income inequality experienced within ten Asian countries over the 1995 to 2010 period. According to the empirical results, we do not support the common perception that corruption decreases economic growth. On the contrary, we found that corruption increases economic growth. Meanwhile, an increase in economic growth will cause an increase in income inequality, although the effect is insignificant. Similarly, an increase in income inequality will cause an increase in economic growth but a decrease in corruption, although the effect is also insignificant.
Keywords: Corruption, economic growth, income inequality, panel vector error correction model.
4840 Solar Radiation Time Series Prediction
Authors: Cameron Hamilton, Walter Potter, Gerrit Hoogenboom, Ronald McClendon, Will Hobbs
Abstract:
A model was constructed to predict the amount of solar radiation that will make contact with the surface of the earth at a given location an hour into the future. This project was supported by the Southern Company to determine at what specific times during a given day of the year solar panels could be relied upon to produce energy in sufficient quantities. Due to their ability to act as universal function approximators, artificial neural networks were used to estimate the nonlinear pattern of solar radiation, with measurements of weather conditions collected at the Griffin, Georgia weather station as inputs. A number of network configurations and training strategies were evaluated, though a multilayer perceptron with a variety of hidden nodes trained with the resilient propagation algorithm consistently yielded the most accurate predictions. In addition, a modeled direct normal irradiance field and adjacent weather station data were used to bolster prediction accuracy. In later trials, the solar radiation field was preprocessed with a discrete wavelet transform with the aim of removing noise from the measurements. The current model provides predictions of solar radiation with a mean square error of 0.0042, though ongoing efforts are being made to further improve the model's accuracy.
Keywords: Artificial Neural Networks, Resilient Propagation, Solar Radiation, Time Series Forecasting.
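A minimal sketch of an hour-ahead regression of this kind with scikit-learn, using synthetic weather inputs; note that scikit-learn's MLP does not implement resilient propagation, so its default optimizer stands in, and the feature names and the input-output relationship are purely illustrative.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
n = 2000

# Synthetic hourly inputs: [temperature, humidity, cloud cover, hour of day, current radiation]
hour = rng.integers(0, 24, size=n)
cloud = rng.uniform(0, 1, size=n)
current_rad = np.clip(np.sin(np.pi * hour / 24), 0, None) * (1 - 0.8 * cloud) * 1000
X = np.column_stack([rng.normal(25, 5, n), rng.uniform(20, 90, n), cloud, hour, current_rad])

# Target: radiation one hour ahead (synthetic relationship for illustration only).
y = np.clip(np.sin(np.pi * (hour + 1) / 24), 0, None) * (1 - 0.8 * cloud) * 1000 + rng.normal(0, 20, n)

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000, random_state=0))
model.fit(X[:1500], y[:1500])
print("held-out R^2:", round(model.score(X[1500:], y[1500:]), 3))
```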
4839 The Boundary Theory between Laminar and Turbulent Flows
Authors: Tomasz M. Jankowski
Abstract:
The basis of this paper is the assumption that the graviton is a measurable entity of molecular gravitational acceleration and not a hypothetical entity. The adoption of this assumption as an axiom is tantamount to fully opening the previously locked door to the boundary theory between laminar and turbulent flows. It leads to the theorem that the division of flows of Newtonian (viscous) fluids into laminar and turbulent is true only if the fluid is influenced by a powerful external force field. The mathematical interpretation of this theorem, presented in this paper, shows that the boundary between laminar and turbulent flow can be determined theoretically. This is a novelty, because thus far the said boundary was determined only empirically, and the reasons for its existence were unknown.
Keywords: Freed gravitons, free gravitons.
4838 A Comparative Study of Standard, Casted and Riveted Eye Design of a Mono Leaf Spring Using CAE Tools
Authors: Gian Bhushan, Vinkel Arora, M.L. Aggarwal
Abstract:
The objective of the present study is to determine a better eye-end design for a mono leaf spring used in a light motor vehicle. A conventional 65Si7 spring steel leaf spring model with standard, casted, and riveted eye ends is considered. The CAD models of the leaf springs are prepared in CATIA and analyzed using ANSYS. The standard eye, casted eye, and riveted eye leaf springs are subjected to similar loading conditions. The CAE analysis of the leaf spring is performed for parameters such as deflection and von Mises stress. A mass reduction of 62.9% is achieved in the case of the riveted eye mono leaf spring as compared to the standard eye mono leaf spring for the same loading conditions.
Keywords: CAE, Leaf Spring, 65Si7 spring steel.
4837 An Agent Based Simulation for Network Formation with Heterogeneous Agents
Authors: Hisashi Kojima, Masatora Daito
Abstract:
We investigate an asymmetric connections model with a dynamic network formation process, using an agent-based simulation. We permit heterogeneity in agents' values. Valuable persons seem to have many links in real social networks. We focus on this point of view and examine whether valuable agents change the structures of the terminal networks. The simulation reveals that valuable agents diversify the terminal networks. We cannot find evidence that valuable agents increase the possibility that star networks survive the dynamic process. We find that valuable agents disperse the degrees of agents in each terminal network on average.
Keywords: network formation, agent based simulation, connections model.
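A small sketch of the payoff computation in a connections model with heterogeneous agent values, using networkx; the exact asymmetric specification and the dynamic formation process studied in the paper are not reproduced, and the decay and cost parameters are illustrative.

```python
import networkx as nx

def connections_payoff(g, values, delta=0.5, cost=0.3):
    """Payoff of each agent: discounted value of reachable agents minus the cost of direct links."""
    payoffs = {}
    for i in g.nodes:
        lengths = nx.shortest_path_length(g, source=i)
        benefit = sum(values[j] * delta ** d for j, d in lengths.items() if j != i)
        payoffs[i] = benefit - cost * g.degree(i)
    return payoffs

# Compare a star network around a high-value agent 0 with a line network.
values = {0: 2.0, 1: 1.0, 2: 1.0, 3: 1.0}
star = nx.star_graph(3)      # node 0 is the centre
line = nx.path_graph(4)
print("star:", connections_payoff(star, values))
print("line:", connections_payoff(line, values))
```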