Search results for: phase error accumulation methodology
2652 A Multimodal Approach for Biometric Authentication with Multiple Classifiers
Authors: Sorin Soviany, Cristina Soviany, Mariana Jurian
Abstract:
The paper presents a multimodal approach for biometric authentication based on multiple classifiers. The proposed solution uses a post-classification biometric fusion method in which the outputs of the biometric data classifiers are combined in order to improve the overall biometric system performance by decreasing the classification error rates. The paper also shows how the biometric recognition task is improved through careful feature selection, since not all components of the feature vectors contribute to the accuracy improvement.
Keywords: biometric fusion, multiple classifiers
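The abstract does not specify the fusion rule; as a rough illustration of post-classification (score-level) fusion, the sketch below combines hypothetical per-modality classifier scores with a weighted sum. The modality names, weights and decision threshold are all assumptions made for the example.

```python
# Hypothetical match scores produced by three independent classifiers for one
# authentication attempt, each scaled to [0, 1] (higher = more likely genuine).
scores = {"fingerprint": 0.82, "face": 0.61, "voice": 0.74}

# Assumed reliability weights for each modality (illustrative values only).
weights = {"fingerprint": 0.5, "face": 0.2, "voice": 0.3}

fused = sum(weights[m] * s for m, s in scores.items())
decision = "accept" if fused >= 0.7 else "reject"   # 0.7 is an arbitrary threshold
print(f"fused score = {fused:.2f} -> {decision}")
```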
2651 Development of a Technology Assessment Model by Patents and Customers' Review Data
Authors: Kisik Song, Sungjoo Lee
Abstract:
Recent years have seen an increasing number of patent disputes due to excessive competition in the global market and a reduced technology life-cycle; this has increased the risk of investment in technology development. While many global companies have started developing methodologies to identify promising technologies and assess them for investment decisions, the existing methodology still has some limitations. Post hoc assessments of new technologies are not being performed, especially to determine whether the suggested technologies turned out to be promising. For example, in existing quantitative patent analysis, a patent's citation information has served as an important metric for quality assessment, but this analysis cannot be applied to recently registered patents because such information accumulates over time. Therefore, we propose a new technology assessment model that can replace citation information and positively affect technological development, based on post hoc analysis of the patents for promising technologies. Additionally, we collect customer reviews on a target technology to extract keywords that reflect the customers' needs, and we determine how many of these keywords are covered by the new technology. Finally, we construct a portfolio based on a technology assessment (from patent information) and a customer-based marketability assessment (from review data), and we use them to visualize the characteristics of the new technologies.
Keywords: Technology assessment, patents, citation information, opinion mining.
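As a toy illustration of the review-based marketability check described above, the snippet below computes how many hypothetical customer-need keywords are covered by a candidate technology's keyword set; all keywords here are invented for the example and do not come from the study.

```python
# Hypothetical customer-need keywords mined from reviews of the target technology.
review_keywords = {"battery life", "fast charging", "wireless", "durability", "light weight"}

# Hypothetical keywords describing a newly suggested technology (e.g., from its patents).
technology_keywords = {"fast charging", "wireless", "thermal management"}

covered = review_keywords & technology_keywords
coverage = len(covered) / len(review_keywords)
print(f"covered needs: {sorted(covered)}, marketability coverage = {coverage:.0%}")
```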
2650 Distribution of Phospholipids, Cholesterol and Carotenoids in Two-Solvent System during Egg Yolk Oil Solvent Extraction
Authors: Aleksandrs Kovalcuks, Mara Duma
Abstract:
Egg yolk oil is a concentrated source of egg bioactive compounds, such as fat-soluble vitamins, phospholipids, cholesterol, carotenoids and others. To extract lipids and other fat-soluble nutrients from liquid egg yolk, a two-step extraction process involving polar (ethanol) and non-polar (hexane) solvents was used. This extraction technique is based on the polarities of the egg yolk bioactive compounds: non-polar compounds are extracted into the non-polar hexane, while polar compounds pass into the polar alcohol/water phase. However, many egg yolk bioactive compounds are neither strongly polar nor strongly non-polar. Egg yolk phospholipids, cholesterol and pigments are amphipathic (they have both polar and non-polar regions), and their behavior in an ethanol/hexane solvent system is not clear. The aim of this study was to clarify the behavior of phospholipids, cholesterol and carotenoids during the extraction of egg yolk oil with ethanol and hexane and to determine the loss of these compounds in the egg yolk oil. Egg yolks and egg yolk oil were analyzed for phospholipid (phosphatidylcholine (PC) and phosphatidylethanolamine (PE)), cholesterol and carotenoid (lutein, zeaxanthin, canthaxanthin and β-carotene) content using GC-FID and HPLC methods. PC and PE are polar lipids and were extracted into the polar ethanol phase; the concentration of PC in ethanol was 97.89% and that of PE 99.81% of the total egg yolk phospholipids. Due to the partial extraction of cholesterol into ethanol, the cholesterol content of the egg yolk oil was reduced in comparison with its total content in egg yolk lipids. The highest amounts of lutein and zeaxanthin were concentrated in the ethanol extract. The opposite was observed for canthaxanthin and β-carotene, which became the main pigments of the egg yolk oil.
Keywords: Cholesterol, egg yolk oil, lutein, phospholipids, solvent extraction.
2649 Crystalline Structure of Starch Based Nano Composites
Authors: Farid Amidi Fazli, Afshin Babazadeh, Farnaz Amidi Fazli
Abstract:
In contrast with the literal meaning of 'nano', researchers have achieved major advances in this area, and every day more nanomaterials are introduced to the market. After a long period of using fossil-based plastics, the accumulation of their waste has become a serious environmental problem, while mankind pays more attention to safety and the living environment. Replacing common plastic packaging materials with degradable ones, which break down faster and convert to harmless components such as water and carbon dioxide, is therefore increasingly attractive; these new materials are based on renewable and inexpensive sources such as starch and cellulose. However, their functional properties are not, by themselves, suitable for packaging. At this point, nanotechnology plays an important role: incorporating nanomaterials into the polymer structure improves its mechanical and physical properties, and nanocrystalline cellulose (NCC) has this ability. This work employed a chemical method to produce NCC and a starch bionanocomposite containing NCC. The obtained materials were characterized by the X-ray diffraction technique. The results showed that the applied method is suitable and applicable for NCC production.
Keywords: Biofilm, cellulose, nanocomposite, starch.
2648 Effect of Domestic Treated Wastewater use on Three Varieties of Amaranth (Amaranthus spp.) under Semi Arid Conditions
Authors: El Youssfi L., Choukr-Allah R., Zaafrani M., Mediouni T., Sarr F., Hirich A.
Abstract:
An experiment was carried out in a field in the south of Morocco to evaluate the effects of using domestic treated wastewater for the irrigation of an amaranth crop under semi-arid conditions. Three varieties (A0020, A0057 and A211) were tested and irrigated using domestic treated wastewater at EC1 (0.92 dS/m) as the control, EC3 (3 dS/m) and EC6 (6 dS/m), the latter two obtained by adding sea water. In terms of growth, an increase in the EC level of the applied irrigation water significantly reduced the plant's height, leaf area, and fresh and dry weight measured at the vegetative, flowering and maturity stages for all varieties. Even with the application of EC6, yields were relatively high in comparison with those obtained under normal cultivation conditions. A significant accumulation of nitrate, chloride and sodium in the soil layers during the crop cycle was noted. The use of treated wastewater for amaranth irrigation thus proved to be possible. The variety A211 was shown to be less sensitive to salinity stress, and its introduction to the study area could be promising.
Keywords: Amaranth, salinity, semi-arid, treated waste water.
2647 Development of an Automatic Calibration Framework for Hydrologic Modelling Using Approximate Bayesian Computation
Authors: A. Chowdhury, P. Egodawatta, J. M. McGree, A. Goonetilleke
Abstract:
Hydrologic models are increasingly used as tools to predict stormwater quantity and quality from urban catchments. However, due to a range of practical issues, most models produce gross errors in simulating complex hydraulic and hydrologic systems. Difficulty in finding a robust approach for model calibration is one of the main issues. Though automatic calibration techniques are available, they are rarely used in common commercial hydraulic and hydrologic modelling software, e.g. MIKE URBAN. This is partly due to the need for a large number of parameters and large datasets in the calibration process. To overcome this practical issue, a framework for automatic calibration of a hydrologic model was developed on the R platform and is presented in this paper. The model was developed based on the time-area conceptualization. Four calibration parameters, namely initial loss, reduction factor, time of concentration and time-lag, were considered as the primary set of parameters. Using these parameters, automatic calibration was performed using Approximate Bayesian Computation (ABC). ABC is a simulation-based technique for performing Bayesian inference when the likelihood is intractable or computationally expensive to compute. To test its performance and usefulness, the technique was used to simulate three small catchments on the Gold Coast. For comparison, simulation outcomes for the same three catchments obtained with the commercial modelling software MIKE URBAN were used. The graphical comparison shows strong agreement of the MIKE URBAN results within the upper and lower 95% credible intervals of the posterior predictions obtained via ABC. Statistical validation of the posterior runoff predictions using the coefficient of determination (CD), root mean square error (RMSE) and maximum error (ME) was found to be reasonable for the three study catchments. The main benefit of using ABC over MIKE URBAN is that ABC provides a posterior distribution for the runoff flow prediction, so the associated uncertainty in the predictions can be obtained; in contrast, MIKE URBAN provides only a point estimate. Based on the results of the analysis, the developed ABC framework appears to perform well for automatic calibration.
Keywords: Automatic calibration framework, approximate Bayesian computation, hydrologic and hydraulic modelling, MIKE URBAN software, R platform.
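The authors' R framework is not reproduced here; the following is a minimal sketch of ABC rejection sampling in Python for a toy rainfall-runoff stand-in, assuming uniform priors and an RMSE distance. The parameter names echo two of the calibration parameters mentioned above, but the model, priors and tolerance are invented for illustration.

```python
import numpy as np

def abc_rejection(simulate, observed, priors, n_draws=10000, tol=0.5, seed=1):
    """Minimal ABC rejection sampler: keep parameter draws whose simulated
    output lies within `tol` (RMSE) of the observed series."""
    rng = np.random.default_rng(seed)
    accepted = []
    for _ in range(n_draws):
        theta = {name: rng.uniform(lo, hi) for name, (lo, hi) in priors.items()}
        sim = simulate(theta)
        if np.sqrt(np.mean((sim - observed) ** 2)) < tol:
            accepted.append(theta)
    return accepted  # empirical approximation of the posterior

# Toy stand-in for a runoff model: runoff = rainfall * reduction_factor - initial_loss.
rain = np.array([5.0, 12.0, 3.0, 8.0])
def toy_model(theta):
    return np.clip(rain * theta["reduction_factor"] - theta["initial_loss"], 0.0, None)

observed = toy_model({"reduction_factor": 0.6, "initial_loss": 1.0})
priors = {"reduction_factor": (0.1, 1.0), "initial_loss": (0.0, 3.0)}
posterior = abc_rejection(toy_model, observed, priors)
print(len(posterior), "accepted parameter sets")
```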
2646 Adaptive Sliding Mode Observer for a Class of Systems
Abstract:
In this paper, the performance of two adaptive observers applied to interconnected systems is studied. The nonlinearity of the systems can be written in a fractional form. The first adaptive observer is an adaptive sliding mode observer for a Lipschitz nonlinear system, and the second is an adaptive sliding mode observer that uses a filtered error as its sliding surface. After comparing their performance on the inverted pendulum mounted on a cart system, it is shown that the second one is more robust for state estimation.
Keywords: Adaptive observer, Lipschitz system, interconnected fractional nonlinear system, sliding mode.
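The observer equations are not given in the abstract; for orientation, a generic textbook-style sliding mode observer for a system \(\dot{x}=Ax+Bu+\phi(x,u)\), \(y=Cx\) has the form below, where the switching gain \(k\) (or the uncertain parameters in \(\phi\)) is adapted online from the output error. The second observer described above differs in that it replaces the raw output error with a filtered version as the sliding surface.

\[
\dot{\hat{x}} \;=\; A\hat{x} + Bu + \hat{\phi}(\hat{x},u) + L\,(y - C\hat{x}) + k\,\operatorname{sign}(y - C\hat{x}).
\]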
2645 A Study on Application of Elastic Theory for Computing Flexural Stresses in Preflex Beam
Authors: Nasiri Ahmadullah, Shimozato Tetsuhiro, Masayuki Tai
Abstract:
This paper presents the step-by-step procedure for using elastic theory to calculate the internal stresses in composite bridge girders prestressed by the preflexing technology, called Prebeam in Japan and preflex beam worldwide. Elastic theory approaches preflex beams in the same way as it does conventional composite girders. Since a preflex beam undergoes different stages of construction, the calculations are made using different sectional and material properties: the stresses at every stage are calculated using the properties of the specific section, and accumulating these stresses gives the available stress in a section of interest. The presence of concrete in the section implies prestress loss due to creep and shrinkage; however, more work is required in this field. In addition to the graphical presentation of this application, the paper discusses important notes from a graphical comparison between the results of an experimental-only study carried out on a preflex beam and the results of a simulation based on the elastic theory approach for an identical beam using finite element modeling (FEM) by the author.
Keywords: Composite girder, elastic theory, preflex beam, prestressing.
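As a hedged sketch of the stage-wise accumulation described above (not the paper's exact formulation), the flexural stress at a fibre a distance \(y\) from the reference axis is evaluated in each construction stage with that stage's section properties and then summed:

\[
\sigma(y) \;=\; \sum_{i=1}^{n} \frac{M_i \,\bigl(y - \bar{y}_i\bigr)}{I_i},
\]

where \(M_i\) is the bending moment applied in stage \(i\) (preflexion, release, concrete casting, composite service load, etc.) and \(\bar{y}_i\), \(I_i\) are the neutral-axis position and second moment of area of the section that is active in that stage.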
2644 Simulation and Analysis of Control System for a Solar Desalination System
Authors: R. Prakash, B. Meenakshipriya, R. Kumaravelan
Abstract:
Fresh water is one of the resources that is being depleted day by day. A wise way to address this issue is through the application of renewable energy (solar irradiation): with decentralized, cheap, energetically self-sufficient, robust and simple-to-operate plants, distillate can be obtained from sea water, river water or even sewage. Solar desalination is a technique used to desalinate water using solar energy. The present work deals with the comprehensive design and simulation of a solar tracking system using LabVIEW, temperature and mass flow rate control of the solar desalination plant using LabVIEW, and analysis of a single-phase inverter circuit with LC filters for the solar pumping system in MATLAB. The main objective of this work is to improve the performance of the solar desalination system using an automatic tracking system, output control based on temperature and mass flow rate, and reduction of the harmonic distortion in the solar pumping system by means of LC filters. The simulation of the single-phase inverter was carried out using MATLAB and the output waveforms were analyzed. Simulations were performed for optimum output temperature control, which in turn controls the mass flow rate of water in the thermal collectors. The solar tracking system was implemented using LabVIEW and was tested successfully. The thermal collectors are tracked in accordance with the sun's irradiance levels, thereby increasing their efficiency.
Keywords: Desalination, electrodialysis, LabVIEW, MATLAB, PWM inverter, reverse osmosis.
2643 Mechanical Properties of Powder Metallurgy Processed Biodegradable Zn-Based Alloy for Biomedical Application
Authors: Maruf Yinka Kolawole, Jacob Olayiwola Aweda, Farasat Iqbal, Asif Ali, Sulaiman Abdulkareem
Abstract:
Zinc is a non-ferrous metal with potential application in orthopaedic implant materials. However, its poor mechanical properties have been a major challenge to its application. Therefore, this paper studies the mechanical properties of a biodegradable Zn-based alloy for biomedical application. Pure zinc powder with varying (0, 1, 2, 3 and 6) wt% additions of magnesium powder was ball milled using a ball-to-powder ratio (B:P) of 10:1 at 350 rpm for 4 hours. The resulting milled powders were compacted at 300 MPa and sintered at 350 °C. Microstructural, phase and mechanical property analyses were performed following American Society for Testing and Materials (ASTM) standards. The results show that magnesium influences the mechanical properties of zinc. The compressive strength, hardness and elastic modulus of 210 ± 8.878 MPa, 76 ± 5.707 HV and 45 ± 11.616 GPa, respectively, obtained for the Zn-2Mg alloy were optimal and meet the minimum requirements for a biodegradable metal in orthopaedic applications. These results represent increases of 111, 93 and 93% in compressive strength, hardness and elastic modulus, respectively, compared with pure zinc. The improvement in mechanical properties was attributed to the effectiveness of the compaction pressure and to intermetallic phase formation within the matrix, resulting in a high dislocation density that improves strength. The study concluded that the Zn-2Mg alloy, with its optimal mechanical properties, can be considered a potential candidate for orthopaedic applications.
Keywords: Biodegradable metal, biomedical application, mechanical properties, powder metallurgy, zinc.
2642 New Product-Type Estimators for the Population Mean Using Quartiles of the Auxiliary Variable
Authors: Amer Ibrahim Falah Al-Omari
Abstract:
In this paper, we suggest new product-type estimators for the population mean of the variable of interest, exploiting the first or the third quartile of the auxiliary variable. We obtain the mean square error equations and the bias of the estimators. We study the properties of these estimators using simple random sampling (SRS) and ranked set sampling (RSS). It is found that SRS and RSS produce approximately unbiased estimators of the population mean; however, the RSS estimators are more efficient than those obtained using SRS, based on the same number of measured units, for all values of the correlation coefficient.
Keywords: Product estimator, auxiliary variable, simple random sampling, extreme ranked set sampling
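The exact estimators are not reproduced in the abstract; a common quartile-modified product-type form and its first-order MSE under SRS, given here as a sketch of the usual derivation rather than the paper's own expressions, are

\[
\hat{\bar{Y}}_{P,j} \;=\; \bar{y}\,\frac{\bar{x} + Q_j}{\bar{X} + Q_j}, \qquad j \in \{1,3\},
\]
\[
\operatorname{MSE}\bigl(\hat{\bar{Y}}_{P,j}\bigr) \;\approx\; \frac{1-f}{n}\,\bar{Y}^{2}\Bigl(C_y^{2} + \theta_j^{2} C_x^{2} + 2\,\theta_j\,\rho\,C_y C_x\Bigr), \qquad \theta_j = \frac{\bar{X}}{\bar{X}+Q_j},
\]

where \(Q_1\) and \(Q_3\) are the first and third quartiles of the auxiliary variable, \(C_x\), \(C_y\) are coefficients of variation, \(\rho\) is the correlation coefficient and \(f\) is the sampling fraction. Under RSS the variance terms are replaced by their smaller RSS counterparts, which is the source of the efficiency gain reported above.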
2641 Affine Projection Adaptive Filter with Variable Regularization
Authors: Young-Seok Choi
Abstract:
We propose two affine projection algorithms (APA) with a variable regularization parameter. The proposed algorithms dynamically update the regularization parameter, which is fixed in the conventional regularized APA (R-APA), using a gradient descent based approach. By introducing a normalized gradient, the proposed algorithms yield an efficient and robust update scheme for the regularization parameter. Through experiments we demonstrate that the proposed algorithms outperform the conventional R-APA in terms of convergence rate and misadjustment error.
Keywords: Affine projection, regularization, gradient descent, system identification.
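The variable-regularization rule itself is not given in the abstract; as a baseline for comparison, the sketch below implements the conventional R-APA with a fixed regularization parameter on a toy system identification task. The filter order, projection order K, step size and delta are illustrative values, not the paper's.

```python
import numpy as np

def r_apa(x, d, order=8, K=4, mu=0.5, delta=1e-2):
    """Conventional regularized APA with fixed delta; the paper adapts delta online."""
    w = np.zeros(order)
    err = np.zeros(len(x))
    for k in range(order + K - 1, len(x)):
        # K most recent input vectors of length `order` (rows of the data matrix A).
        A = np.array([x[k - j - np.arange(order)] for j in range(K)])
        dk = d[k - np.arange(K)]
        e = dk - A @ w
        w = w + mu * A.T @ np.linalg.solve(A @ A.T + delta * np.eye(K), e)
        err[k] = e[0]
    return w, err

# Usage: identify an unknown 8-tap FIR system from noisy observations.
rng = np.random.default_rng(0)
h = np.array([0.7, -0.3, 0.2, 0.1, 0.0, 0.05, -0.02, 0.01])   # hypothetical true system
x = rng.standard_normal(5000)
d = np.convolve(x, h)[: len(x)] + 0.01 * rng.standard_normal(len(x))
w_hat, _ = r_apa(x, d)
print(np.round(w_hat, 3))
```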
2640 Design and Optimization for a Compliant Gripper with Force Regulation Mechanism
Authors: Nhat Linh Ho, Thanh-Phong Dao, Shyh-Chour Huang, Hieu Giang Le
Abstract:
This paper presents the design and optimization of a compliant gripper. The gripper is constructed based on the concept of a compliant mechanism with flexure hinges. A passive force regulation mechanism is presented to control the grasping force on a micro-sized object instead of using a force sensor. The force regulation mechanism is designed using planar springs. The gripper is expected to achieve a large range of displacement in order to handle objects of various sizes. First, the statics and dynamics of the gripper are investigated using finite element analysis in the ANSYS software. The design parameters of the gripper are then optimized via the Taguchi method: an L9 orthogonal array is used to establish the experimental matrix, and the signal-to-noise ratio is analyzed to find the optimal solution. Finally, response surface methodology is employed to model the relationship between the design parameters and the output displacement of the gripper, and the design of experiments method is used in a sensitivity analysis to determine the effect of each parameter on the displacement. The results showed that the compliant gripper can achieve a large displacement of 213.51 mm and that the force regulation mechanism is expected to be useful for high-precision positioning systems.
Keywords: Flexure hinge, compliant mechanism, compliant gripper, force regulation mechanism, Taguchi method, response surface methodology, design of experiment.
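For reference, when the output displacement is treated as a larger-the-better characteristic (an assumption; the abstract does not state which S/N form is used), the Taguchi signal-to-noise ratio analyzed from the L9 runs is

\[
S/N \;=\; -10\,\log_{10}\!\left(\frac{1}{n}\sum_{i=1}^{n}\frac{1}{y_i^{2}}\right),
\]

where \(y_i\) are the \(n\) repeated displacement responses for a given parameter combination; the combination maximizing S/N is taken as the optimum before the response surface model is fitted.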
2639 Increase of Peroxidase Activity of Haptoglobin (2-2)-Hemoglobin at Pathologic Temperature and Presence of Antibiotics
Authors: M. Tayari, S. Z. Moosavi-nejad, A. Shabani, M. Rezaei Tavirani
Abstract:
Free hemoglobin promotes the accumulation of hydroxyl radicals through its heme iron, which can react with endogenous hydrogen peroxide to produce free radicals that may cause severe oxidative cell damage. Haptoglobin binds hemoglobin strongly, and the haptoglobin-hemoglobin binding is irreversible. The peroxidase activity of the haptoglobin(2-2)-hemoglobin complex was assayed by following the increase in absorption of the produced tetraguaiacol, the second substrate of the haptoglobin-hemoglobin complex, at 470 nm and 42 °C with a UV-Vis spectrophotometer. The results show that the peroxidase activity of the haptoglobin(2-2)-hemoglobin complex is modulated via a homotropic effect of hydrogen peroxide as an allosteric substrate. On the other hand, the antioxidant property of haptoglobin(2-2)-hemoglobin was increased via a heterotropic effect of the two drugs (especially ampicillin) on the peroxidase activity of the complex. Both drugs also have a mild effect on the homotropic behavior of the peroxidase activity of the haptoglobin(2-2)-hemoglobin complex. Therefore, these in vitro studies show that the two drugs may help the Hp-Hb complex to remove hydrogen peroxide from serum at a pathologic temperature (42 °C).
Keywords: Haptoglobin, Hemoglobin, Antioxidant, Antibiotics.
2638 Simulation and Assessment of Carbon Dioxide Separation by Piperazine Blended Solutions Using E-NRTL and Peng-Robinson Models: A Study of Regeneration Heat Duty
Authors: Arash Esmaeili, Zhibang Liu, Yang Xiang, Jimmy Yun, Lei Shao
Abstract:
High-pressure carbon dioxide (CO2) absorption from a specific off-gas in a conventional column was evaluated, in response to environmental concerns, with the Aspen HYSYS simulator using a wide range of single absorbents and piperazine (PZ) blended solutions, in order to estimate the outlet CO2 concentration, CO2 loading, reboiler power supply and regeneration heat duty and to choose the most efficient solution in terms of CO2 removal and required heat duty. The property package, which is compatible with all of the solutions applied in this simulation study, estimates the properties based on the electrolyte non-random two-liquid (E-NRTL) model for electrolyte thermodynamics and the Peng-Robinson equation of state for the vapor phase and liquid hydrocarbon phase properties. The simulation results indicate that PZ and the PZ/monoethanolamine (MEA) mixture demand the highest regeneration heat duty among the studied single and blended amine solutions, respectively. The blended amine solutions with the lowest PZ concentrations (5 wt% and 10 wt%) were considered and compared in order to reduce the cost of the process, among which the blended solution of 10 wt% PZ + 35 wt% MDEA (methyldiethanolamine) was found to be the most appropriate in terms of CO2 content in the outlet gas, rich-CO2 loading and regeneration heat duty.
Keywords: Absorption, amine solutions, Aspen HYSYS, CO2 loading, piperazine, regeneration heat duty.
2637 A Robust Audio Fingerprinting Algorithm in MP3 Compressed Domain
Authors: Ruili Zhou, Yuesheng Zhu
Abstract:
In this paper, a new robust audio fingerprinting algorithm in the MP3 compressed domain is proposed, with high robustness to time scale modification (TSM). Instead of simply employing short-term information from the MP3 stream, the new algorithm extracts long-term features in the MP3 compressed domain by using modulation frequency analysis. Our experiments have demonstrated that the proposed method can achieve a hit rate above 95% in audio retrieval and resist attacks of 20% TSM. It has a lower bit error rate (BER) than the other algorithms. The proposed algorithm can also be used in other compressed domains, such as AAC.
Keywords: Audio fingerprinting, MP3, modulation frequency, TSM.
2636 PM10 Prediction and Forecasting Using CART: A Case Study for Pleven, Bulgaria
Authors: Snezhana G. Gocheva-Ilieva, Maya P. Stoimenova
Abstract:
Ambient air pollution with fine particulate matter (PM10) is a persistent, systematic problem in many countries around the world. The accumulation of a large number of measurements of both the PM10 concentrations and the accompanying atmospheric factors allows for statistical modeling to detect dependencies and forecast future pollution. This study applies the classification and regression trees (CART) method for building and analyzing PM10 models. In the empirical study, average daily air data for the city of Pleven, Bulgaria, for a period of 5 years are used. The predictors in the models are seven meteorological variables, time variables, as well as lagged PM10 variables and some lagged meteorological variables, delayed by 1 or 2 days with respect to the initial time series. The degree of influence of the predictors in the models is determined. The selected best CART models are used to forecast PM10 concentrations two days ahead of the last date in the modeling procedure and show very accurate results.
Keywords: Cross-validation, decision tree, lagged variables, short-term forecasting.
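A minimal sketch of the modelling setup described above, using scikit-learn's regression tree on synthetic stand-in data (the real study uses five years of daily Pleven data with seven meteorological predictors; the variable names, distributions and tree settings here are assumptions):

```python
import numpy as np
import pandas as pd
from sklearn.tree import DecisionTreeRegressor

# Hypothetical daily series standing in for PM10 and two meteorological variables.
rng = np.random.default_rng(0)
n = 400
df = pd.DataFrame({
    "pm10": rng.gamma(4.0, 10.0, n),
    "temp": rng.normal(10.0, 8.0, n),
    "wind": rng.gamma(2.0, 1.5, n),
})

# Lagged predictors delayed by 1 and 2 days, as in the modelling setup described above.
for col in ["pm10", "temp", "wind"]:
    df[f"{col}_lag1"] = df[col].shift(1)
    df[f"{col}_lag2"] = df[col].shift(2)
df["dayofweek"] = np.arange(n) % 7          # stand-in for the time variables
df = df.dropna()

X = df.drop(columns="pm10")
y = df["pm10"]

# Fit a regression tree; depth and leaf size would normally be chosen by cross-validation.
tree = DecisionTreeRegressor(max_depth=5, min_samples_leaf=10, random_state=0).fit(X, y)
print(dict(zip(X.columns, np.round(tree.feature_importances_, 3))))
```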
2635 Bayesian Belief Networks for Test Driven Development
Authors: Vijayalakshmy Periaswamy S., Kevin McDaid
Abstract:
Testing accounts for the major share of the technical effort in the software development process; typically, it consumes more than 50 percent of the total cost of developing a piece of software. The selection of software tests is a very important activity within this process to ensure that the software reliability requirements are met. Generally, tests are run to achieve maximum coverage of the software code, and very little attention is given to the achieved reliability of the software. Using an existing methodology, this paper describes how to use Bayesian Belief Networks (BBNs) to select unit tests based on their contribution to the reliability of the module under consideration. In particular, the work examines how the approach can enhance test-first development by assessing the quality of test suites resulting from this development methodology and providing insight into additional tests that can significantly improve the achieved reliability. In this way the method can produce an optimal selection of inputs, and the order in which the tests are executed, to maximize the software reliability. To illustrate this approach, a belief network is constructed for a modern software system, incorporating expert opinion, expressed through probabilities, on the relative quality of the elements of the software and the potential effectiveness of the software tests. The steps involved in constructing the Bayesian network are explained, as is a method for accommodating the test suite resulting from test-driven development.
Keywords: Software testing, Test Driven Development, Bayesian Belief Networks.
2634 The Enhancement of Target Localization Using Ship-Borne Electro-Optical Stabilized Platform
Authors: Jaehoon Ha, Byungmo Kang, Kilho Hong, Jungsoo Park
Abstract:
Electro-optical (EO) stabilized platforms have been widely used for surveillance and reconnaissance on various types of vehicles, from surface ships to unmanned air vehicles (UAVs). EO stabilized platforms usually consist of an assembly of structures, bearings, and motors called gimbals, in which a gyroscope is installed. EO elements, such as a CCD camera and an IR camera, are mounted to a gimbal, which has a range of motion in elevation and azimuth and can designate and track a target. In addition, a laser range finder (LRF) can be added to the gimbal in order to acquire the precise slant range from the platform to the target. Recently, a versatile target localization capability has become necessary in order to cooperate with the weapon systems mounted on the same platform, and the target information, such as location or velocity, needs to be more accurate. The accuracy of the target information depends on various component errors and on the alignment errors of each component. In particular, the type of moving platform can affect the accuracy of the target information. In the case of flying platforms, or UAVs, the target location error can increase with altitude, so it is important to measure altitude as precisely as possible. In the case of surface ships, the target location error can increase with the obliqueness of the elevation angle of the gimbal, since the altitude of the EO stabilized platform is relatively low; the farther the slant range from the surface ship to the target, the more extreme the obliqueness of the elevation angle, which hampers the precise acquisition of the target information. So far, there have been many studies on EO stabilized platforms for flying vehicles, but few researchers have focused on ship-borne EO stabilized platforms. In this paper, we deal with a target localization method for an EO stabilized platform located on the mast of a surface ship, where we need to overcome the limitation caused by the obliqueness of the elevation angle of the gimbal. We introduce a well-known approach for target localization using the Unscented Kalman Filter (UKF), present the problem definition showing the above-mentioned limitation, and demonstrate the effectiveness of the approach through computer simulations.
Keywords: Target localization, ship-borne electro-optical stabilized platform, unscented Kalman filter.
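The UKF measurement model is not given in the abstract, but the geometric difficulty it describes can be seen from a simplified flat-earth relation between the gimbal angles, the slant range and the target position (a sketch that ignores ship attitude and earth curvature). With azimuth \(\alpha\), depression angle \(\epsilon\) below the horizontal, and slant range \(R\), the target position relative to the sensor is

\[
x = R\cos\epsilon\,\sin\alpha,\qquad y = R\cos\epsilon\,\cos\alpha,\qquad z = -R\sin\epsilon .
\]

If the range had to be inferred from the sensor height \(h\) alone, \(R = h/\sin\epsilon\) and

\[
\left|\frac{\partial R}{\partial \epsilon}\right| = \frac{h\cos\epsilon}{\sin^{2}\epsilon},
\]

which grows rapidly as \(\epsilon\) approaches zero. This is why a low-mounted ship-borne platform viewing distant targets at near-grazing (oblique) elevation angles suffers large location errors from small angular errors, and why the LRF measurement and UKF filtering are combined here.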
2633 Network Reconfiguration of Distribution System Using Artificial Bee Colony Algorithm
Authors: S. Ganesh
Abstract:
Power distribution systems typically have tie and sectionalizing switches whose states determine the topological configuration of the network. The aim of network reconfiguration of the distribution network is to minimize the losses for a given load arrangement at a particular time. Thus, the objective is to minimize the losses of the network while satisfying the distribution network constraints: radiality, voltage limits and the power balance condition. In this paper, the status of the switches is obtained using the Artificial Bee Colony (ABC) algorithm. ABC is based on the intelligent foraging behavior of honeybee swarms and was developed by observing how real bees find nectar and share information about food sources with the bees in the hive. The proposed methodology has three stages. In stage one, ABC is used to find the tie switches; in stage two, the identified tie switches are checked against the radiality constraint, and if the radiality constraint is satisfied the procedure proceeds to stage three, otherwise the process is repeated; in stage three, load flow analysis is performed. The process is repeated until the losses are minimized. ABC is implemented to find the power flow path, and the forward sweep algorithm is used to calculate the power flow parameters. The proposed methodology is applied to a 33-bus single feeder distribution network using MATLAB.
Keywords: Artificial Bee Colony (ABC) algorithm, Distribution system, Loss reduction, Network reconfiguration.
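The switch-selection and load-flow details are specific to the paper; as a generic illustration of the ABC metaheuristic named above, the sketch below minimizes a continuous test function with employed, onlooker and scout phases. The population size, abandonment limit and quadratic objective are arbitrary; the study applies the same idea to discrete tie-switch choices, with the radiality check and forward-sweep load flow inside the evaluation.

```python
import numpy as np

def abc_minimize(f, bounds, n_food=20, limit=30, max_iter=200, seed=0):
    """Generic Artificial Bee Colony minimizer (continuous-variable sketch)."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    dim = len(bounds)
    foods = rng.uniform(lo, hi, (n_food, dim))      # food sources = candidate solutions
    fit = np.array([f(x) for x in foods])
    trials = np.zeros(n_food, dtype=int)
    best_x, best_f = foods[fit.argmin()].copy(), fit.min()

    def neighbour(i):
        k = rng.integers(n_food)
        while k == i:
            k = rng.integers(n_food)
        j = rng.integers(dim)
        v = foods[i].copy()
        v[j] += rng.uniform(-1, 1) * (foods[i, j] - foods[k, j])
        return np.clip(v, lo, hi)

    for _ in range(max_iter):
        # Employed bee phase: local search around each food source.
        for i in range(n_food):
            v = neighbour(i)
            fv = f(v)
            if fv < fit[i]:
                foods[i], fit[i], trials[i] = v, fv, 0
            else:
                trials[i] += 1
        # Onlooker bee phase: probabilistic selection biased towards better sources.
        prob = fit.max() - fit + 1e-12
        prob /= prob.sum()
        for _ in range(n_food):
            i = rng.choice(n_food, p=prob)
            v = neighbour(i)
            fv = f(v)
            if fv < fit[i]:
                foods[i], fit[i], trials[i] = v, fv, 0
            else:
                trials[i] += 1
        # Scout bee phase: abandon sources that stopped improving.
        for i in range(n_food):
            if trials[i] > limit:
                foods[i] = rng.uniform(lo, hi)
                fit[i] = f(foods[i])
                trials[i] = 0
        if fit.min() < best_f:
            best_f, best_x = fit.min(), foods[fit.argmin()].copy()
    return best_x, best_f

# Usage: minimize a simple quadratic as a stand-in for the loss objective.
print(abc_minimize(lambda x: float(np.sum(x ** 2)), [(-5, 5)] * 3))
```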
2632 The Role and Importance of Genome Sequencing in Prediction of Cancer Risk
Authors: M. Sadeghi, H. Pezeshk, R. Tusserkani, A. Sharifi Zarchi, A. Malekpour, M. Foroughmand, S. Goliaei, M. Totonchi, N. Ansari–Pour
Abstract:
The role and relative importance of intrinsic and extrinsic factors in the development of complex diseases such as cancer remain a controversial issue. Determining the amount of variation explained by these factors requires experimental data and statistical models. These models are nevertheless based on the occurrence and accumulation of random mutational events during stem cell division, thus rendering cancer development a stochastic outcome. We demonstrate not only that individual genome sequencing is uninformative in determining cancer risk, but also that assigning a unique genome sequence to any given individual (healthy or affected) is not meaningful. Current whole-genome sequencing approaches are therefore unlikely to realize the promise of personalized medicine. In conclusion, since the genome sequence differs from cell to cell and changes over time, determining the risk of complex diseases based on genome sequence is somewhat unrealistic, and the resulting data are likely to be inherently uninformative.
Keywords: Cancer risk, extrinsic factors, genome sequencing, intrinsic factors.
2631 The Relative Efficiency Based on the MSE in Generalized Ridge Estimate
Authors: Chao Yuan, Bao Guang Tian
Abstract:
A relative efficiency is defined for the ridge estimate in the general linear model, based on the mean square error (MSE). In this paper, we put forward a choice of the ridge parameter and discuss the relative efficiency between the ridge estimate and the generalized ridge estimate. Finally, the paper proves, in terms of the MSE, that this estimate is better than the generalized ridge estimate.
Keywords: Ridge estimate, generalized ridge estimate, MSE, relative efficiency.
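For orientation (standard definitions, not necessarily the paper's exact notation), with the model \(y = X\beta + \varepsilon\) the ordinary and generalized ridge estimates and the MSE used to compare them are

\[
\hat{\beta}(k) = (X^{\top}X + kI)^{-1}X^{\top}y, \qquad
\hat{\beta}(K) = (X^{\top}X + K)^{-1}X^{\top}y,\quad K=\operatorname{diag}(k_1,\dots,k_p),
\]
\[
\operatorname{MSE}(\hat{\beta}) \;=\; E\|\hat{\beta}-\beta\|^{2} \;=\; \operatorname{tr}\operatorname{Var}(\hat{\beta}) + \|\operatorname{Bias}(\hat{\beta})\|^{2},
\]

and the relative efficiency discussed here is the ratio of the two MSE values, so the ratio indicates which estimate is preferable for the chosen ridge parameter.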
2630 The use of a Bespoke Computer Game For Teaching Analogue Electronics
Authors: Olaf Hallan Graven, Dag Andreas Hals Samuelsen
Abstract:
An implementation of a design for a game-based virtual learning environment is described. The game was developed for a course in analogue electronics, and the topic is the design of a power supply. This task can be solved in a number of different ways, within certain constraints, giving the students a certain amount of freedom, although the game is designed not to facilitate a trial-and-error approach. The use of storytelling and a virtual gaming environment provides the student with the learning material in an MMORPG environment. The game was tested on a group of second-year electrical engineering students with good results.
Keywords: Analogue electronics, e-learning, computer games for learning, virtual reality.
2629 Study of Adaptive Filtering Algorithms and the Equalization of Radio Mobile Channel
Authors: Said Elkassimi, Said Safi, B. Manaut
Abstract:
This paper presents a study of three algorithms: an equalization algorithm that equalizes the transmission channel using the ZF and MMSE criteria, applied to the BRAN A channel, and the adaptive filtering algorithms LMS and RLS, which estimate the parameters of the equalizer filter, i.e. they track the channel estimate and therefore reflect the temporal variations of the channel and reduce the error in the transmitted signal. The performance of the equalizer with the ZF and MMSE criteria is presented for the case without noise, together with a comparison of the performance of the LMS and RLS algorithms.
Keywords: Adaptive filtering, second equalizer, LMS, RLS, BRAN A, Proakis (B), MMSE, ZF.
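For reference, the two equalization criteria and the LMS adaptation mentioned above take the following standard forms (textbook expressions, not taken from the paper). For a channel frequency response \(H(f)\) with noise-to-signal power ratio \(\sigma_n^2/\sigma_s^2\),

\[
W_{\mathrm{ZF}}(f) = \frac{1}{H(f)}, \qquad
W_{\mathrm{MMSE}}(f) = \frac{H^{*}(f)}{|H(f)|^{2} + \sigma_n^{2}/\sigma_s^{2}},
\]

while the LMS algorithm adapts the equalizer taps \(\mathbf{w}_n\) from the input vector \(\mathbf{x}_n\) and error \(e_n = d_n - \mathbf{w}_n^{H}\mathbf{x}_n\) as \(\mathbf{w}_{n+1} = \mathbf{w}_n + \mu\, e_n^{*}\,\mathbf{x}_n\); RLS replaces the fixed step \(\mu\) with a recursively updated inverse correlation matrix, converging faster at a higher computational cost.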
2628 The Impact of Revenue Gap on Economic Growth: A Case Study of Pakistan
Authors: M. Ilyas, M. W. Siddiqi
Abstract:
This study employs the auto-regressive distributed lag (ARDL) bounds approach to cointegration for the long-run analysis and error correction modeling (ECM) for the short-run analysis to examine the relationship between the revenue gap and economic growth for Pakistan, using annual time series data over the period 1980 to 2008. The short- and long-run results indicate that the revenue gap is statistically significant and negatively affects economic growth. The significant and negative coefficient of the error correction term in the ECM indicates that, after a shock, convergence back towards the long-run equilibrium occurs at a rate of about 10.406 percent within a year.
Keywords: ARDL cointegration, economic growth, revenue gap, Pakistan.
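As a sketch of the short-run specification behind the reported adjustment speed (the exact lag orders and regressors are not given in the abstract), the error correction model has the generic form

\[
\Delta y_t \;=\; \alpha_0 \;+\; \sum_{i=1}^{p}\beta_i\,\Delta y_{t-i} \;+\; \sum_{j=0}^{q}\gamma_j\,\Delta x_{t-j} \;+\; \lambda\,\mathrm{ECT}_{t-1} \;+\; \varepsilon_t,
\]

where \(y\) is economic growth, \(x\) the revenue gap, and \(\mathrm{ECT}_{t-1}\) the lagged deviation from the long-run (cointegrating) relationship; the reported figure corresponds to \(\lambda \approx -0.104\), i.e. roughly 10.4% of any deviation from equilibrium is corrected each year.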
2627 An Archetype to Sustain Knowledge Management Systems through Intranet
Authors: B. T. Sayed, Nafaâ Jabeur, M. Aref
Abstract:
The creation and maintenance of knowledge management systems has been recognized as an important research area. Consequently, a lack of accurate results from knowledge management systems limits the organization in applying its knowledge management processes. This leads to a failure in getting the right information to the right people at the right time, followed by deficiencies in decision-making processes. An intranet offers a powerful tool for communication and collaboration, for presenting data and information, and for creating and sharing knowledge, all in one easily accessible place. This paper proposes an archetype describing how a knowledge management system, with the support of intranet capabilities, could greatly increase the accuracy of capturing, storing and retrieving knowledge-based processes, thereby increasing the efficiency of the system. Such a system requires a critical mass of usage by the users for the intranet to function as a knowledge management system. This prototype would lead to the design of an application that supports the creation and maintenance of an effective knowledge management system through the intranet. The aim of this paper is to introduce an effective system for capturing, storing and distributing knowledge in a form that avoids the failures found in most existing systems. The methodology used in the system requires all the employees in the organization to contribute as much as possible in order to make the system successful. The system is still at an initial stage, and the authors are in the process of practically implementing the ideas described here in order to produce satisfactory results.
Keywords: Knowledge Management Systems, Intranet, Methodology.
2626 Ovshinsky Effect by Quantum Mechanics
Authors: Thomas V. Prevenslik
Abstract:
Ovshinsky initiated scientific research in the field of amorphous and disordered materials that continues to this day. The Ovshinsky Effect, in which the resistance of thin GST films is significantly reduced upon the application of a low voltage, is of fundamental importance in phase-change random access memory (PC-RAM) devices. GST stands for GeSbTe chalcogenide-type glasses. However, the Ovshinsky Effect is not without controversy. Ovshinsky thought that the resistance of GST films is reduced by the redistribution of charge carriers, whereas others at that time, including many PC-RAM researchers today, argue that the GST resistance changes because the GST amorphous state is transformed to the crystalline state by melting, with the heat supplied by external heaters. In this controversy, quantum mechanics (QM) asserts that the heat capacity of GST films vanishes, and therefore melting cannot occur, as the heat supplied cannot be conserved by an increase in GST film temperature. By precluding melting, QM re-opens the controversy between the melting and charge-carrier mechanisms. Supporting analysis is presented to show that, instead of increasing the GST film temperature, conservation proceeds by the QED-induced creation of photons within the GST film, the QED photons being confined by TIR. QED stands for quantum electrodynamics and TIR for total internal reflection. The TIR confinement of the QED photons is enhanced by the fact that the heat energy absorbed in the GST film is concentrated in the TIR mode because of the film's high surface-to-volume ratio. The QED photons, having Planck energy beyond the ultraviolet, produce excitons by the photoelectric effect, the electrons and holes of which reduce the GST film resistance.
Keywords: Ovshinsky, phase change memory, PC-RAM, chalcogenide, quantum mechanics, quantum electrodynamics.
2625 pth Moment Exponential Synchronization of a Class of Chaotic Neural Networks with Mixed Delays
Authors: Zixin Liu, Shu Lü, Shouming Zhong, Mao Ye
Abstract:
This paper studies the pth moment exponential synchronization of a class of stochastic neural networks with mixed delays. Based on Lyapunov stability theory, and by establishing a new integro-differential inequality with mixed delays, several sufficient conditions are derived to ensure pth moment exponential stability of the error system. The criteria extend and improve some earlier results. One numerical example is presented to illustrate the validity of the main results.
Keywords: pth moment exponential synchronization, stochastic neural networks, mixed time delays.
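For reference, the synchronization notion studied here is standard: with \(e(t)\) the error between the drive and response networks and \(\tau\) an upper bound on the delays, the two networks are pth moment exponentially synchronized if there exist constants \(M \ge 1\) and \(\lambda > 0\) such that

\[
E\,\|e(t)\|^{p} \;\le\; M\, e^{-\lambda t}\, \sup_{-\tau \le s \le 0} E\,\|e(s)\|^{p}, \qquad t \ge 0,
\]

with \(p = 2\) giving the familiar mean-square exponential synchronization.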
2624 Mental Vulnerability and Coping Strategies as a Factor for Academic Success for Pupils with Special Education Needs
Authors: T. Dubayova
Abstract:
Slovak as well as foreign authors believe that the influence of non-cognitive factors on a student's academic success or failure is unquestionable. The aim of this paper is to establish a link between the mental vulnerability and the coping strategies used by 4th grade elementary school students in dealing with stressful situations and their academic performance, which was used as a simple quantitative indicator of academic success. The research sample consists of 320 students representing the standard population and 60 students with special education needs (SEN), who were assessed with the Strengths and Difficulties Questionnaire (SDQ) by their teachers and who themselves filled in the Children's Coping Strategies Checklist (CCSC-R1). Students with SEN showed a markedly higher frequency of mental vulnerability (34.5%) than students representing the standard population (7%). The poorest academic performance of students with SEN was associated with avoidance behavior displayed in stressful situations; students from the standard population did not show this association. Students with SEN are more likely to display mental health problems than students from the standard population, which may be caused by the accumulation of, and frequent exposure to, situations that they perceive as stressful.
Keywords: Coping, mental vulnerability, students with special education needs, academic performance, academic success.
2623 Detection of Action Potentials in the Presence of Noise Using Phase-Space Techniques
Authors: Christopher Paterson, Richard Curry, Alan Purvis, Simon Johnson
Abstract:
Emerging bio-engineering fields such as brain-computer interfaces, neuroprosthetic devices, and the modeling and simulation of neural networks have led to increased research activity in algorithms for the detection, isolation and classification of action potentials (AP) from noisy data trains. Current techniques in the field of 'unsupervised, no-prior-knowledge' biosignal processing include energy operators, wavelet detection and adaptive thresholding. These tend to be biased towards larger AP waveforms, APs may be missed due to deviations in spike shape and frequency, and correlated noise spectra can cause false detections. Such algorithms also tend to incur large computational expense. A new signal detection technique based upon the ideas of phase-space diagrams and trajectories is proposed, built around the use of a delayed copy of the AP to highlight discontinuities relative to the background noise. This idea has been used to create algorithms that are computationally inexpensive and address the above problems. Distinct APs have been picked out and manually classified from real physiological data recorded from a cockroach. To facilitate testing of the new technique, an autoregressive moving average (ARMA) noise model has been constructed based upon the background noise of the recordings. Together with the AP classes, this model enables the generation of realistic neuronal data sets at arbitrary signal-to-noise ratios (SNR).
Keywords: Action potential detection, low SNR, phase-space diagrams/trajectories, unsupervised/no-prior knowledge.
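A minimal sketch of the delayed-copy phase-space idea described above: the signal is plotted against a delayed copy of itself, and samples whose excursion from the noise cloud exceeds a radius threshold are flagged as spikes. The embedding delay, threshold rule and refractory period are invented for illustration and are not the paper's values.

```python
import numpy as np

def detect_spikes(x, delay=5, k=4.0, refractory=30):
    """Phase-space sketch: embed the signal against a delayed copy of itself and flag
    samples whose distance from the centre of the noise cloud exceeds k times the
    median radius. `delay`, `k` and `refractory` are illustrative values."""
    x = np.asarray(x, dtype=float)
    a, b = x[delay:], x[:-delay]                      # trajectory (x(t), x(t - delay))
    r = np.hypot(a - np.median(a), b - np.median(b))  # excursion from the noise cloud
    above = np.flatnonzero(r > k * np.median(r))
    spikes, last = [], -refractory
    for i in above:                                   # keep one detection per excursion
        if i - last >= refractory:
            spikes.append(i + delay)
            last = i
    return spikes

# Usage on a synthetic trace: Gaussian noise with three injected AP-like events.
rng = np.random.default_rng(2)
sig = rng.normal(0.0, 1.0, 3000)
for t in (500, 1400, 2300):
    sig[t:t + 20] += 8.0 * np.hanning(20)             # crude action potential shape
print(detect_spikes(sig))
```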