Search results for: Critical trajectory method

5364 Hippocampus Segmentation using a Local Prior Model on its Boundary

Authors: Dimitrios Zarpalas, Anastasios Zafeiropoulos, Petros Daras, Nicos Maglaveras

Abstract:

Segmentation techniques based on Active Contour Models have benefited strongly from the use of prior information during their evolution. Shape prior information is captured from a training set and introduced into the optimization procedure to restrict the evolution to allowable shapes. In this way, the evolution converges onto regions even with weak boundaries. Although significant effort has been devoted to different ways of capturing and analyzing prior information, very little attention has been paid to the way image information is combined with prior information. This paper focuses on a more natural way of incorporating prior information into the level set framework. As a proof of concept, the method is applied to hippocampus segmentation in T1-MR images. Hippocampus segmentation is a very challenging task, due to the inhomogeneous surrounding region and the missing boundary with the neighboring amygdala, whose intensities are identical. The proposed method mimics the way humans perform this segmentation and thus improves segmentation accuracy.

Keywords: Medical imaging & processing, brain MRI segmentation, hippocampus segmentation, hippocampus-amygdala missing boundary, weak boundary segmentation, region based segmentation, prior information, local weighting scheme in level sets, spatial distribution of labels, gradient distribution on boundary.

5363 Innovation Strategies and Challenges in Emerging Economies: The Case of Research and Technology Organizations in Turkey

Authors: F. Demir

Abstract:

Innovation is highly critical for every company, especially for technology-based organizations seeking to sustain their competitive advantage. However, this is not an easy task: regardless of size, market, and location, all organizations face numerous challenges. Even though large barriers to innovation exist across countries, firm- and industry-specific challenges can be distinguished. This paper examines innovation strategies and obstacles to innovation in research and technology organizations (RTOs) in Turkey. Nine different challenges are ranked from most to least important according to the results of the survey. The findings reveal that financial constraints are the biggest challenge to taking the lead in innovation, which is consistent with the related literature. Beyond that, based on a sample of 40 RTOs, regional factors such as an underdeveloped regional innovation ecosystem play a significant role in hampering innovation. Most of the organizations (55%) embrace an incremental approach to innovation, while only a few pursue radical shifts. About 40% of the RTOs focus on product innovation and 27.5% concentrate on technological innovation, while a very limited number aim for operational excellence and customer engagement as the focus of their strategic innovation efforts.

Keywords: Innovation strategies, innovation challenges, emerging economies, research and technology organizations.

5362 Tsunami Inundation Modeling in a Boundary Fitted Curvilinear Grid Model Using the Method of Lines Technique

Authors: M. Ashaque Meah, M. Shah Noor, M Asif Arefin, Md. Fazlul Karim

Abstract:

A numerical technique in a boundary-fitted curvilinear grid model is developed to simulate the extent of inland inundation along the coastal belts of Peninsular Malaysia and Southern Thailand due to the 2004 Indian Ocean tsunami. Tsunami propagation and run-up are also studied in this paper. The vertically integrated shallow water equations are solved using the method of lines (MOL). For this purpose, boundary-fitted grids are generated along the coastal and island boundaries and the other open boundaries of the model domain. A transformation is applied to the governing equations so that the physical domain is converted into a rectangular one. The MOL technique is applied to the transformed shallow water equations and the boundary conditions, converting the equations into an initial value problem for a system of ordinary differential equations. Finally, the classical fourth-order Runge-Kutta method is used to solve these ordinary differential equations. A moving boundary technique is applied instead of a fixed seaside wall or fixed coastal boundary, to allow the coastal boundary to move. The extent of water intrusion and the associated tsunami propagation are simulated for the 2004 Indian Ocean tsunami along the west coast of Peninsular Malaysia and southern Thailand. The simulated results are compared with results obtained from a finite difference model and with the data available on the USGS website. All simulations approximate the observations better than earlier research and show excellent agreement with the observed data.
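
The abstract does not reproduce the discretized equations, so the sketch below illustrates only the core numerical idea: the method of lines turns a PDE into a system of ODEs by discretizing space, and the classical fourth-order Runge-Kutta scheme then advances that system in time. A 1-D linear advection equation stands in for the transformed shallow water equations; the grid, wave speed, and boundary treatment are all illustrative assumptions.

```python
import numpy as np

DX, C = 0.1, 1.0  # grid spacing and wave speed (illustrative values)

def rhs(eta):
    """Semi-discrete form of d(eta)/dt = -c * d(eta)/dx (upwind differences)."""
    detadt = np.zeros_like(eta)
    detadt[1:] = -C * (eta[1:] - eta[:-1]) / DX
    detadt[0] = 0.0  # placeholder open-boundary condition
    return detadt

def rk4_step(eta, dt):
    """One classical fourth-order Runge-Kutta step for the ODE system."""
    k1 = rhs(eta)
    k2 = rhs(eta + 0.5 * dt * k1)
    k3 = rhs(eta + 0.5 * dt * k2)
    k4 = rhs(eta + dt * k3)
    return eta + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

x = np.arange(0.0, 10.0, DX)
eta = np.exp(-(x - 2.0) ** 2)   # initial free-surface profile
for _ in range(200):            # CFL number c*dt/dx = 0.5, stable for upwind
    eta = rk4_step(eta, dt=0.05)
```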

Keywords: Open boundary condition, moving boundary condition, boundary-fitted curvilinear grids, far field tsunami, Shallow Water Equations, tsunami source, Indonesian tsunami of 2004.

5361 Fuzzy Risk-Based Life Cycle Assessment for Estimating Environmental Aspects in EMS

Authors: Kevin Fong-Rey Liu, Ken Yeh, Cheng-Wu Chen, Han-Hsi Liang

Abstract:

Environmental aspects play a central role in an environmental management system (EMS) because they are the basis for identifying an organization's environmental targets. Existing methods for the assessment of environmental aspects fall into three categories: risk-assessment-based (RA-based), LCA-based, and criterion-based methods. To combine the benefits of these three categories of research, this study proposes an integrated framework combining RA-, LCA- and criterion-based methods. The integrated framework incorporates LCA techniques to identify the causal linkage of aspect, pathway, receptor, and impact; uses fuzzy logic to assess aspects; considers fuzzy conditions in likelihood assessment; and employs a new multi-criteria decision analysis method, multi-criteria and multi-connection comprehensive assessment (MMCA), to estimate significant aspects in EMS. The proposed model is verified using a real case study, and the results show that the method successfully prioritizes the environmental aspects.

Keywords: Environmental management system, environmental aspect, risk assessment, life cycle assessment.

5360 A Novel RLS Based Adaptive Filtering Method for Speech Enhancement

Authors: Pogula Rakesh, T. Kishore Kumar

Abstract:

Speech enhancement is a long-standing problem with numerous applications such as teleconferencing, VoIP, hearing aids, and speech recognition. The motivation behind this work is to obtain a clean speech signal of higher quality by applying an optimal noise cancellation technique. Real-time adaptive filtering algorithms are among the best candidates in all categories of speech enhancement methods. In this paper, we propose a speech enhancement method based on a Recursive Least Squares (RLS) adaptive filter for speech signals. Experiments were performed on noisy data prepared by adding AWGN, babble, and pink noise to clean speech samples at -5 dB, 0 dB, 5 dB, and 10 dB SNR levels. We then compare the noise cancellation performance of the proposed RLS algorithm with the existing NLMS algorithm in terms of Mean Squared Error (MSE), Signal-to-Noise Ratio (SNR), and SNR loss. Based on this performance evaluation, the proposed RLS algorithm was found to be the better noise cancellation technique for speech signals.
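
A minimal sketch of an RLS noise canceller in the usual two-input arrangement: the primary channel carries speech plus noise, the reference channel carries correlated noise, and the a priori error is the enhanced speech. The filter order, forgetting factor, and initialization constant are illustrative assumptions, not values from the paper.

```python
import numpy as np

def rls_cancel(d, x, order=8, lam=0.999, delta=0.01):
    """RLS adaptive noise canceller.
    d: primary input (speech + noise), x: noise reference, lam: forgetting factor."""
    w = np.zeros(order)              # adaptive filter weights
    P = np.eye(order) / delta        # inverse autocorrelation matrix estimate
    e = np.zeros(len(d))             # a priori error = enhanced speech
    for n in range(order, len(d)):
        u = x[n - order:n][::-1]        # most recent reference samples
        k = P @ u / (lam + u @ P @ u)   # gain vector
        e[n] = d[n] - w @ u             # a priori estimation error
        w = w + k * e[n]                # weight update
        P = (P - np.outer(k, u @ P)) / lam
    return e

# Toy usage: a sinusoidal "speech" signal buried in filtered white noise
rng = np.random.default_rng(0)
noise = rng.standard_normal(8000)
speech = np.sin(2 * np.pi * 0.01 * np.arange(8000))
enhanced = rls_cancel(speech + np.convolve(noise, [0.5, 0.3], "same"), noise)
```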

Keywords: Adaptive filter, Adaptive Noise Canceller, Mean Squared Error, Noise reduction, NLMS, RLS, SNR, SNR Loss.

5359 The Influence of Beta Shape Parameters in Project Planning

Authors: Αlexios Kotsakis, Stefanos Katsavounis, Dimitra Alexiou

Abstract:

Networks can be used to represent project planning problems, with nodes for activities and arcs indicating precedence relationships between them. For fixed activity durations, a simple algorithm calculates the amount of time required to complete a project, along with the activities that comprise the critical path. The Program Evaluation and Review Technique (PERT) generalizes this model by incorporating uncertainty, allowing activity durations to be random variables, but it nevertheless produces a relatively crude solution to planning problems. In this paper, based on the findings of the relevant literature, which strongly suggest that a Beta distribution can be employed to model earthmoving activities, we use Monte Carlo simulation to estimate the project completion time distribution and to measure the influence of skewness, an element inherent in the activities of modern technical projects. We also extract the activity criticality index, with the ultimate goal of producing more accurate planning estimates.
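
The abstract summarizes the simulation without giving its mechanics, so here is a minimal sketch of the Monte Carlo procedure on a hypothetical two-path network: activity durations are drawn from Beta distributions rescaled to each activity's optimistic-pessimistic range, and skewness enters through the Beta shape parameters. The network, ranges, and shapes are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical network with paths A->B->D and A->C->D; per activity: (a, b, alpha, beta)
activities = {
    "A": (2.0, 6.0, 2.0, 4.0),   # right-skewed duration
    "B": (3.0, 9.0, 4.0, 2.0),   # left-skewed duration
    "C": (4.0, 8.0, 3.0, 3.0),   # symmetric duration
    "D": (1.0, 3.0, 2.0, 2.0),
}

def draw(a, b, alpha, beta):
    """Beta-distributed duration rescaled from [0, 1] to [a, b]."""
    return a + (b - a) * rng.beta(alpha, beta)

n = 100_000
completion = np.empty(n)
b_critical = 0                       # counts runs where path A-B-D is critical
for i in range(n):
    t = {k: draw(*v) for k, v in activities.items()}
    p1 = t["A"] + t["B"] + t["D"]
    p2 = t["A"] + t["C"] + t["D"]
    completion[i] = max(p1, p2)
    b_critical += p1 >= p2

print(f"mean completion: {completion.mean():.2f}")
print(f"95th percentile: {np.percentile(completion, 95):.2f}")
print(f"criticality index of B: {b_critical / n:.3f}")
```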

Keywords: Beta distribution, PERT, Monte Carlo Simulation, skewness, project completion time distribution.

5358 Performance Analysis of a Discrete-time Geo^X/G/1 Queue with Single Working Vacation

Authors: Shan Gao, Zaiming Liu

Abstract:

This paper treats a discrete-time batch-arrival queue with a single working vacation. The main purpose of this paper is to present a performance analysis of this system using the supplementary variable technique. To this end, we first analyze the Markov chain underlying the queueing system and obtain its ergodicity condition. Next, we present the stationary distributions of the system length as well as some performance measures at random epochs using the supplementary variable method. Third, still based on the supplementary variable method, we give the probability generating function (PGF) of the number of customers at the beginning of a busy period and a stochastic decomposition formula for the PGF of the stationary system length at departure epochs. Additionally, we investigate the relation between our discrete-time system and its continuous-time counterpart. Finally, numerical examples show the influence of the parameters on some crucial performance characteristics of the system.

Keywords: Discrete-time queue, batch arrival, working vacation, supplementary variable technique, stochastic decomposition.

5357 Thermal Stability Boundary of FG Panel under Aerodynamic Load

Authors: Sang-Lae Lee, Ji-Hwan Kim

Abstract:

In this study, the stability boundary of a Functionally Graded (FG) panel under heating and supersonic airflow is investigated. Material properties are assumed to be temperature dependent, and a simple power-law distribution is taken. First-order shear deformation theory (FSDT) of plates is applied to model the panel, and the von Kármán strain-displacement relations are adopted to account for the geometric nonlinearity due to large deformation. Further, first-order piston theory is used to model the supersonic aerodynamic load acting on the panel, and a Rayleigh damping coefficient is used to represent the structural damping. In order to find the critical value of the airflow speed, a linear flutter analysis of FG panels is performed. Numerical results are compared with previous works, and the present results for the temperature-dependent material are discussed in detail with respect to the stability boundary of the panel for various volume fractions and aerodynamic pressures.

Keywords: Functionally graded panels, Linear flutter analysis, Supersonic airflows, Temperature dependent material property.

5356 Maximizer of the Posterior Marginal Estimate for Noise Reduction of JPEG-compressed Image

Authors: Yohei Saika, Yuji Haraguchi

Abstract:

We construct a method of noise reduction for JPEG-compressed images based on Bayesian inference using the maximizer of the posterior marginal (MPM) estimate. In this method, we apply the MPM estimate using two kinds of likelihood for grayscale images degraded by lossy JPEG compression: a deterministic likelihood model and a probabilistic one expressed by a Gaussian distribution. Then, using Monte Carlo simulation on grayscale images, such as the 256-grayscale standard image "Lena" with 256 × 256 pixels, we examine the performance of the MPM estimate with the mean square error as the performance measure. We find that the MPM estimate via the Gaussian probabilistic likelihood model is effective for reducing noise, such as blocking artifacts and mosquito noise, if the parameters are set appropriately. On the other hand, the MPM estimate via the deterministic likelihood model is not effective for noise reduction, due to the low acceptance ratio of the Metropolis algorithm.

Keywords: Noise reduction, JPEG-compressed image, Bayesian inference, maximizer of the posterior marginal estimate.

5355 Mathematical Modeling to Predict Surface Roughness in CNC Milling

Authors: Ab. Rashid M.F.F., Gan S.Y., Muhammad N.Y.

Abstract:

Surface roughness (Ra) is one of the most important requirements in machining processes. In order to obtain better surface roughness, the proper setting of cutting parameters is crucial before the process takes place. This research presents the development of a mathematical model for predicting surface roughness before the milling process, in order to evaluate the fitness of the machining parameters: spindle speed, feed rate, and depth of cut. 84 samples were run in this study on a FANUC α-T14iE CNC milling machine. The samples were randomly divided into two data sets: the training set (m = 60) and the testing set (m = 24). ANOVA showed that at least one of the population regression coefficients was non-zero. Multiple regression was used to determine the correlation between the criterion variable and a combination of predictor variables, and it established that surface roughness is most strongly influenced by feed rate. Using the multiple regression equation, the average percentage deviation was 9.8% for the testing set and 9.7% for the training set; that is, the statistical model predicts surface roughness with about 90.2% accuracy on the testing set and 90.3% accuracy on the training set.
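
The regression model itself is not reproduced in the abstract; the sketch below shows the general form of such a fit: a first-order multiple regression of Ra on spindle speed, feed rate, and depth of cut, evaluated by average percentage deviation. All data values are fabricated placeholders, not the paper's 84 samples.

```python
import numpy as np

# Hypothetical records: spindle speed (rpm), feed rate (mm/min), depth of cut (mm)
cutting = np.array([[1500, 200, 0.5],
                    [2000, 300, 1.0],
                    [2500, 250, 0.8],
                    [3000, 400, 1.2],
                    [1800, 350, 0.6]])
ra = np.array([1.8, 2.6, 2.1, 3.2, 2.4])      # measured roughness (um)

# Least-squares fit of Ra = b0 + b1*speed + b2*feed + b3*depth
X = np.column_stack([np.ones(len(ra)), cutting])
coef, *_ = np.linalg.lstsq(X, ra, rcond=None)

pred = X @ coef
deviation = np.abs(pred - ra) / ra * 100      # percentage deviation per sample
print(f"average deviation: {deviation.mean():.1f}% "
      f"(accuracy ~{100 - deviation.mean():.1f}%)")
```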

Keywords: Surface roughness, regression analysis.

5354 A New Source Code Auditing Algorithm for Detecting LFI and RFI in PHP Programs

Authors: Seyed Ali Mir Heydari, Mohsen Sayadiharikandeh

Abstract:

Static analysis of source code is used to audit web applications for vulnerabilities. In this paper, we propose a new algorithm that analyzes PHP source code to detect potential Local File Inclusion (LFI) and Remote File Inclusion (RFI) vulnerabilities. In our approach, we first define patterns for finding functions that can be abused through unhandled user input. More precisely, we use regular expressions as a fast and simple means of defining vulnerability-detection patterns. As inclusion functions can also be used safely, many false positives (FPs) can occur. The first cause of these FPs is that a flagged function may not actually use a user-supplied variable as an argument, so we extract a list of user-supplied variables to be used for detecting vulnerable lines of code. Furthermore, since the vulnerability can spread among variables, for example through multi-level assignment, we also extract the hidden user-supplied variables. We use the resulting list to decrease the false positives of our method. Finally, as there are ways to guard inclusion functions against abuse, we define additional patterns to detect these safeguards and further decrease our false positives.
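
As a rough illustration of the pattern-plus-taint idea (not the authors' implementation), the sketch below flags PHP inclusion sinks whose argument contains either a superglobal or a variable tainted through assignment. The regular expressions are deliberately simplistic.

```python
import re

SINK = re.compile(r'\b(include|include_once|require|require_once)\s*\(?\s*(.+?)\s*\)?\s*;')
SOURCE = re.compile(r'\$_(GET|POST|REQUEST|COOKIE)\b')          # user-supplied data
ASSIGN = re.compile(r'(\$\w+)\s*=\s*(.+?);')                    # taint propagation

def scan(php_source):
    tainted, findings = set(), []
    for lineno, line in enumerate(php_source.splitlines(), 1):
        m = ASSIGN.search(line)
        if m and (SOURCE.search(m.group(2))
                  or any(v in m.group(2) for v in tainted)):
            tainted.add(m.group(1))      # multi-level assignment spreads taint
        s = SINK.search(line)
        if s and (SOURCE.search(s.group(2))
                  or any(v in s.group(2) for v in tainted)):
            findings.append((lineno, line.strip()))
    return findings

code = '$page = $_GET["p"];\n$f = $page;\ninclude($f . ".php");'
print(scan(code))   # flags line 3 via the hidden user-supplied variable $f
```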

Keywords: User-supplied variables, hidden user-supplied variables, PHP vulnerabilities.

5353 Temperature Dependence of Relative Permittivity: A Measurement Technique Using Split Ring Resonators

Authors: Sreedevi P. Chakyar, Jolly Andrews, V. P. Joseph

Abstract:

A compact method for measuring the relative permittivity of a dielectric material at different temperatures, using a single circular Split Ring Resonator (SRR) metamaterial unit as a test probe, is presented in this paper. The dielectric constant of a material depends on its temperature, and the LC resonance of the SRR depends on its dielectric environment; hence, the temperature of the dielectric material in contact with the resonator influences its resonant frequency. A single SRR placed between transmitting and receiving probes connected to a Vector Network Analyser (VNA) is used as the test probe. The dependence of the SRR resonant frequency on temperature between 30 °C and 60 °C is analysed. Relative permittivities ε of test samples at different temperatures are extracted from a calibration graph drawn between the relative permittivity of samples of known dielectric constant and their corresponding resonant frequencies. This method is found to be an easy and efficient technique for analysing the temperature-dependent permittivity of different materials.
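
A sketch of the calibration step as described: fit the (approximately linear, over a narrow band) relation between known permittivities and measured resonant frequencies, then invert it for unknown samples at each temperature. All numbers below are illustrative assumptions.

```python
import numpy as np

# Calibration samples: known relative permittivity vs. measured SRR resonance (GHz)
eps_known = np.array([1.0, 2.1, 2.55, 3.0, 4.4])
f_res = np.array([4.52, 4.31, 4.24, 4.18, 4.02])

a, b = np.polyfit(eps_known, f_res, 1)   # f = a*eps + b over the calibrated range

def permittivity(f_measured):
    """Invert the calibration line to read permittivity off a measured resonance."""
    return (f_measured - b) / a

# Track a sample's extracted permittivity as its temperature rises from 30 C to 60 C
for temp, f in [(30, 4.27), (40, 4.29), (50, 4.31), (60, 4.33)]:
    print(f"{temp} C: eps = {permittivity(f):.2f}")
```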

Keywords: Metamaterials, negative permeability, permittivity measurement techniques, split ring resonators, temperature dependent dielectric constant.

5352 Military Attack Helicopter Selection Using Distance Function Measures in Multiple Criteria Decision Making Analysis

Authors: C. Ardil

Abstract:

This paper aims to select the best military attack helicopter for purchase by the Armed Forces, to provide greater reconnaissance and offensive combat capability in military operations. For this purpose, a multiple criteria decision analysis method integrated with the variance weighting procedure was applied to the military attack helicopter selection problem. A real military aviation case problem is conducted to support the Armed Forces' decision-making process and contribute to the better performance of the Armed Forces. Application of the methodology resulted in ranking lists for ordering and prioritizing attack helicopters, bringing transparency and simplicity to the decision-making process. Nine military attack helicopter models were analyzed in the light of strategic, tactical, and operational criteria. The selected military attack helicopter would be used for the fire support and reconnaissance activities required by Armed Forces operations. This study makes a valuable contribution to the problem of military attack helicopter selection, as it represents a state-of-the-art application of the MCDMA method to a real problem of the Armed Forces. The methodology presented in this paper can be used to solve a wide variety of real problems, especially strategic, tactical, and operational ones, and is therefore a very useful method for decision making.

Keywords: Aircraft selection, military attack helicopter selection, attack helicopter fleet planning, MCDMA, multiple criteria analysis, multiple criteria decision making analysis, distance function measure.

5351 Swarmed Discriminant Analysis for Multifunction Prosthesis Control

Authors: Rami N. Khushaba, Ahmed Al-Ani, Adel Al-Jumaily

Abstract:

One of the approaches enabling people with amputated limbs to establish an interface with the real world is the utilization of the myoelectric signal (MES) from the remaining muscles of those limbs. The MES can be used as a control input to a multifunction prosthetic device. In this control scheme, known as myoelectric control, a pattern recognition approach is usually utilized to discriminate between the MES signals that belong to different classes of forearm movements. Since the MES is recorded using multiple channels, the feature vector can become very large. In order to reduce the computational cost and enhance the generalization capability of the classifier, a dimensionality reduction method is needed to identify an informative yet moderately sized feature set. This paper proposes a new fuzzy version of the well-known Fisher's Linear Discriminant Analysis (LDA) feature projection technique. Furthermore, based on the fact that certain muscles may contribute more to the discrimination process, a novel feature weighting scheme is also presented that employs Particle Swarm Optimization (PSO) to estimate the weight of each feature. The new method, called PSOFLDA, is tested on real MES datasets and compared with other techniques to demonstrate its superiority.

Keywords: Discriminant Analysis, Pattern Recognition, Signal Processing.

5350 Linear Programming Application in Unit Commitment of Wind Farms with Considering Uncertainties

Authors: M. Esmaeeli Shahrakht, A. Kazemi

Abstract:

Due to the uncertainty of wind velocity, wind power generators do not have deterministic output power. Utilizing wind power generation together with thermal power plants creates new concerns for power system operation engineers. In this paper, a model is presented to capture the uncertainty of load and generated wind power, which can be utilized in power system operation planning. The stochastic behavior of the parameters is simulated by generating scenarios, each of which can be solved by a deterministic method. A mixed-integer linear programming method is used to solve the deterministic generation scheduling problem. The proposed approach is applied to a 12-unit test system including 10 thermal units and 2 wind farms. The results show the effectiveness of the piecewise linear model in unit commitment problems. Using linear programming also reduces calculation times considerably and guarantees convergence to the global optimum. Neglecting the uncertainty of wind velocity results in a higher assessed cost of generation scheduling.
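
The scenario step can be pictured as below: sample load and wind trajectories around their forecasts, so that each sampled scenario becomes a deterministic input to the mixed-integer scheduling model. The forecast shapes and error levels here are illustrative assumptions, not the paper's data.

```python
import numpy as np

rng = np.random.default_rng(1)
HOURS = 24
load_forecast = 900 + 150 * np.sin(np.linspace(0, 2 * np.pi, HOURS))  # MW
wind_forecast = np.full(HOURS, 120.0)                                 # MW, both farms

def make_scenarios(n=100, load_sigma=0.03, wind_sigma=0.20):
    """Sample n load/wind scenarios; wind is far more uncertain than load."""
    load = load_forecast * (1 + load_sigma * rng.standard_normal((n, HOURS)))
    wind = wind_forecast * (1 + wind_sigma * rng.standard_normal((n, HOURS)))
    return load, np.clip(wind, 0.0, None)   # wind output cannot go negative

load_s, wind_s = make_scenarios()
net_load = load_s - wind_s   # thermal units must cover this in every scenario
print(net_load.mean(axis=0).round(1)[:6])
```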

Keywords: Load uncertainty, linear programming, scenario generation, unit commitment, wind farm.

5349 Bond Graph and Bayesian Networks for Reliable Diagnosis

Authors: Abdelaziz Zaidi, Belkacem Ould Bouamama, Moncef Tagina

Abstract:

The Bond Graph, as a unified multidisciplinary tool, is widely used not only for dynamic modelling but also for Fault Detection and Isolation, because of its structural and causal properties. A binary Fault Signature Matrix can be generated systematically, but making the final binary decision is not always feasible because of the problems such a method reveals. The purpose of this paper is to introduce a methodology that improves the classical binary decision-making method, so that unknown and identical failure signatures can be handled and robustness improved. The approach consists of associating the evaluated residuals with component reliability data to build a Hybrid Bayesian Network. This network is used in two distinct inference procedures: one for the continuous part and one for the discrete part. The continuous nodes of the network hold the prior probabilities of component failures, which are used by the inference procedure on the discrete part to compute the posterior probabilities of the failures. The developed methodology is applied to a real steam generator pilot process.
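
The discrete inference step amounts to a Bayes update: component failure priors (from reliability data) are combined with the likelihood of the evaluated residual pattern under each failure hypothesis. A minimal sketch with invented numbers for two components:

```python
# Hypothetical priors from component reliability data
priors = {"pump": 0.02, "valve": 0.05, "no fault": 0.93}

# Hypothetical likelihood of the observed residual pattern under each hypothesis;
# this is what lets two identical binary signatures be told apart.
likelihood = {"pump": 0.70, "valve": 0.40, "no fault": 0.01}

evidence = sum(priors[h] * likelihood[h] for h in priors)
posterior = {h: priors[h] * likelihood[h] / evidence for h in priors}
for h, p in sorted(posterior.items(), key=lambda kv: -kv[1]):
    print(f"P({h} | residuals) = {p:.3f}")
```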

Keywords: Redundancy relations, decision-making, Bond Graph, reliability, Bayesian Networks.

5348 Secure Cryptographic Operations on SIM Card for Mobile Financial Services

Authors: Kerem Ok, Serafettin Senturk, Serdar Aktas, Cem Cevikbas

Abstract:

Mobile technology is very popular nowadays and provides a digital world where users can experience many value-added services. Service providers are eager to offer diverse value-added services to users, such as digital identity and mobile financial services. In this context, the security of data storage on smartphones and the security of communication between the smartphone and the service provider are critical to the success of these services. The SIM card is one acceptable alternative for providing the required security functions. Since SIM cards include a Secure Element, they are able to store sensitive data, create cryptographically secure keys, and encrypt and decrypt data. In this paper, we design and implement a SIM and smartphone framework that uses the SIM card for secure key generation, key storage, data encryption, data decryption, and digital signing for mobile financial services. Our framework shows that the SIM card can be used as a controlled Secure Element to provide the security functions required by popular e-services such as mobile financial services.

Keywords: SIM Card, mobile financial services, cryptography, secure data storage.

5347 Efficacy of Combined CHAp and Lanthanum Carbonate in Therapy for Hyperphosphatemia

Authors: Andreea Cârâc, Elena Moroşan, Ana Corina Ioniță, Rica Boscencu, Geta Cârâc

Abstract:

Although lanthanum carbonate has not been approved by the FDA for the treatment of hyperphosphatemia, we prospectively evaluated the efficacy of the combination of calcium hydroxyapatite (CHAp) and lanthanum carbonate (LaC) for the treatment of hyperphosphatemia in mice. CHAp was prepared by a co-precipitation method using Ca(OH)2, H3PO4, and NH4OH, with calcination at 1200 °C. Lanthanum carbonate was prepared by a chemical method using NaHCO3 and LaCl3 in a low-pH environment (below 4.0). The structures were characterized by FTIR spectra and SEM-EDX analysis. The study group included 16 mice divided into four groups according to the administered substance: lanthanum carbonate (group A), CHAp (group B), lanthanum carbonate + CHAp (group C), and salt water (group D). The results indicate a phosphate decrease when the mice were treated with CHAp and lanthanum carbonate (in 0.5% CMC) in a single dose of 1500 mg/kg. At 12 hours after administration, serum phosphate concentration decreased from 4.5 ± 0.8 mg/dL to 4.05 ± 0.2 mg/dL (P < 0.01) in group A, and to 3.6 ± 0.2 mg/dL in group C. The combination of CHAp and lanthanum carbonate is a suitable regimen for hyperphosphatemia treatment because it avoids both the hypercalcemia of CaCO3 and the adverse effects of CHAp.

Keywords: Calcium hydroxyapatite, hyperphosphatemia, lanthanum carbonate, phosphate binder, structures.

5346 Detecting and Locating Wormhole Attacks in Wireless Sensor Networks Using Beacon Nodes

Authors: He Ronghui, Ma Guoqing, Wang Chunlei, Fang Lan

Abstract:

This paper focuses on wormhole attack detection in wireless sensor networks. The wormhole attack is particularly challenging to deal with, since the adversary does not need to compromise any nodes and can use laptops or other wireless devices to relay packets over a low-latency channel. This paper introduces an easy and effective method to detect and locate wormholes: since beacon nodes are assumed to know their coordinates, the straight-line distance between each pair of them can be calculated and compared with the corresponding hop distance, which in this paper equals the hop count × the node transmission range R. A dramatic difference may emerge because of an existing wormhole, and our detection mechanism is based on this. The approximate location of the wormhole can also be derived in further steps from this information. To the best of our knowledge, our method is much simpler than other wormhole-detection schemes that also use beacon nodes, and compared to schemes with special requirements on each node (e.g., GPS receivers, tightly synchronized clocks, or directional antennas), ours is more economical. Simulation results show that the algorithm succeeds in detecting and locating wormholes when the density of beacon nodes reaches 0.008 per m².
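
The test itself is one comparison per beacon pair, as sketched below; the transmission range and slack factor are illustrative assumptions. Since a legitimate path of h hops can never span more than h × R in a straight line, a pair whose Euclidean separation far exceeds its hop distance betrays a wormhole.

```python
import math

R = 30.0  # node transmission range in metres (assumed)

def wormhole_suspected(beacon_a, beacon_b, hop_count, slack=1.0):
    """Flag a beacon pair whose straight-line separation exceeds what
    their reported hop count could legitimately cover."""
    (xa, ya), (xb, yb) = beacon_a, beacon_b
    euclid = math.hypot(xb - xa, yb - ya)
    return euclid > slack * hop_count * R

# Two beacons 400 m apart that claim to reach each other in 3 hops (max 90 m)
print(wormhole_suspected((0.0, 0.0), (400.0, 0.0), hop_count=3))   # True
```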

Keywords: Beacon node, wireless sensor network, wormhole attack.

5345 Multidimensional Compromise Programming Evaluation of Digital Commerce Websites

Authors: C. Ardil

Abstract:

Multidimensional compromise programming evaluation of digital commerce websites is essential not only for obtaining recommendations for improvement, but also for making comparisons with global business competitors. This research provides a multidimensional decision-making model that derives objective criteria weights for various commerce websites and ranks them using a multidimensional compromise solution. Evaluation of digital commerce website quality can be considered a complex information system structure involving qualitative and quantitative factors in a multicriteria decision-making problem. The proposed approach consists of three sequential steps. In the first step, three major evaluation criteria are characterized for the website ranking problem. In the second step, the identified critical criteria are weighted using the standard deviation procedure. In the third step, multidimensional compromise programming is applied to rank the digital commerce websites.
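
Steps two and three admit a compact sketch: standard-deviation weights computed from the normalized decision matrix, followed by a compromise ranking via weighted distance to the ideal point. The matrix values and the choice of the L2 metric are illustrative assumptions.

```python
import numpy as np

# Hypothetical decision matrix: 4 websites x 3 benefit criteria
X = np.array([[0.80, 0.62, 0.95],
              [0.65, 0.90, 0.70],
              [0.92, 0.55, 0.60],
              [0.70, 0.75, 0.85]])

# Step 2: objective weights from each normalized criterion's standard deviation
norm = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))
w = norm.std(axis=0) / norm.std(axis=0).sum()

# Step 3: compromise programming, weighted L2 distance to the ideal point
ideal = norm.max(axis=0)
dist = np.sqrt(((w * (ideal - norm)) ** 2).sum(axis=1))

print("weights:", w.round(3))
print("ranking (best first):", (np.argsort(dist) + 1).tolist())
```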

Keywords: Standard deviation, commerce website, website evaluation, multicriteria decision making, multicriteria compromise programming, website quality, multidimensional decision analysis.

5344 Technique for Processing and Preservation of Human Amniotic Membrane for Ocular Surface Reconstruction

Authors: Irfan Z. Qureshi, Fareeha A., Wajid A. Khan

Abstract:

Human amniotic membrane (HAM) is a useful biological material for the reconstruction of a damaged ocular surface. The processing and preservation of HAM are critical to protect patients undergoing amniotic membrane transplantation (AMT) from cross-infection. For HAM preparation, a human placenta is obtained after an elective cesarean delivery. Before collection, the donor is screened for seronegativity for HCV, HBsAg, HIV, and syphilis. After collection, the placenta is washed in balanced salt solution (BSS) in a sterile environment. The amniotic membrane is then separated from the placenta and the chorion while keeping the preparation in BSS. The HAM is scraped manually until all debris is removed and a clear, transparent membrane is obtained. Nitrocellulose membrane filters are then placed on the stromal side of the HAM and cut around the edges, with a small fold of membrane left towards the other side to make separation easy during surgery. The HAM is finally stored in a 1:1 solution of glycerine and Dulbecco's Modified Eagle Medium (DMEM) containing antibiotics. The capped Borosil vials containing HAM are kept at -80 °C until use. A vial is thawed to room temperature and opened under sterile operating theatre conditions at the time of surgery.

Keywords: HAM, AMT, ocular transplant.

5343 Multiple Criteria Decision Making Analysis for Selecting and Evaluating Fighter Aircraft

Authors: C. Ardil, A. M. Pashaev, R.A. Sadiqov, P. Abdullayev

Abstract:

In this paper, a multiple criteria decision making analysis technique is presented for ranking and selecting a set of alternatives (fighter aircraft) associated with a set of decision factors. In fighter aircraft design, conflicting decision criteria, disciplines, and technologies are always involved in the design process, and multiple criteria decision making analysis can help to deal with such situations and make wise design decisions; it provides a systematic mathematical approach for decision problems that contain uncertainties. The feasibility and contribution of applying the technique to fighter aircraft selection are explored. In this study, an integrated framework incorporating multiple criteria decision making analysis into fighter aircraft selection is established using the entropy objective weighting method. The multiple decision criteria are aggregated into one composite figure of merit, which serves as an objective function in the decision process; a suitable multiple criteria decision making analysis method thus provides an effective objective function for the analysis. Considering that the inherent uncertainties and the weighting factors have crucial impacts on the evaluation, seven fighter aircraft models are assessed against the multiple design criteria in terms of the weighting factors. The proposed model is based on an integrated entropy index procedure and additive multiple criteria decision making analysis theory, and it provides an efficient approach for uncertainty assessment of the decision problem. Finally, the fighter aircraft alternatives are ranked based on their final evaluation scores, and a sensitivity analysis is conducted.
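
The entropy weighting step, at least in its textbook form, is easy to make concrete; whether the paper follows exactly this variant is an assumption, and the decision matrix below is invented for illustration.

```python
import numpy as np

# Hypothetical decision matrix: 7 fighter aircraft x 4 benefit criteria
X = np.array([[7.5, 8.0, 6.5, 9.0],
              [8.2, 7.1, 7.8, 8.4],
              [6.9, 9.2, 8.1, 7.5],
              [9.1, 6.8, 7.2, 8.8],
              [7.8, 7.9, 8.6, 7.1],
              [8.5, 8.3, 6.9, 8.0],
              [7.2, 7.5, 9.0, 7.7]])

p = X / X.sum(axis=0)                            # column-wise proportions
entropy = -(p * np.log(p)).sum(axis=0) / np.log(len(X))
w = (1 - entropy) / (1 - entropy).sum()          # low entropy -> high weight

scores = (p * w).sum(axis=1)                     # additive aggregation into one figure of merit
print("entropy weights:", w.round(3))
print("ranking (best first):", (np.argsort(-scores) + 1).tolist())
```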

Keywords: Fighter aircraft, fighter aircraft selection, multiple criteria decision making, multiple criteria decision making analysis, MCDMA.

5342 Specialized Reduced Models of Dynamic Flows in 2-Stroke Engines

Authors: S. Cagin, X. Fischer, E. Delacourt, N. Bourabaa, C. Morin, D. Coutellier, B. Carré, S. Loumé

Abstract:

The complexity of scavenging by ports and its impact on engine efficiency create the need to understand and model it as realistically as possible. However, there are few empirical scavenging models, and those are highly specialized; in a design optimization process they prove very restrictive, and their field of use is limited. This paper presents a comparison of two methods for establishing and reducing a model of the scavenging process in 2-stroke diesel engines. To address the lack of scavenging models, a CFD model has been developed and is used as the reference case. However, its large size requires reduction. Two techniques have been tested, differing in their fields of application: the NTF method and neural networks. Both prove highly appropriate, drastically reducing the model's size (over 90% reduction) with a low relative error rate (under 10%). Furthermore, each method produces a reduced model suited to a distinct specialized field of application: the distribution of a quantity (e.g., mass fraction) in the cylinder at each time step (pseudo-dynamic model), or the qualification of scavenging at the end of the process (pseudo-static model).

Keywords: Diesel engine, Design optimization, Model reduction, Neural network, NTF algorithm, Scavenging.

5341 Performance Comparison of Particle Swarm Optimization with Traditional Clustering Algorithms used in Self-Organizing Map

Authors: Anurag Sharma, Christian W. Omlin

Abstract:

The self-organizing map (SOM) is a well-known data reduction technique used in data mining. It can reveal structure in data sets through data visualization that is otherwise hard to detect from the raw data alone. However, interpretation through visual inspection is prone to errors and can be very tedious. There are several techniques for the automatic detection of clusters among the code vectors found by SOM, but they generally do not take into account the distribution of the code vectors; this may lead to unsatisfactory clustering and poorly defined cluster boundaries, particularly where the density of data points is low. In this paper, we propose the use of an adaptive heuristic particle swarm optimization (PSO) algorithm for finding cluster boundaries directly from the code vectors obtained from SOM. The application of our method to several standard data sets demonstrates its feasibility. The PSO algorithm utilizes the so-called U-matrix of the SOM to determine cluster boundaries; the results of this novel automatic method compare very favorably to boundary detection through the traditional algorithms, namely k-means and a hierarchical approach, which are normally used to interpret the output of SOM.

Keywords: cluster boundaries, clustering, code vectors, data mining, particle swarm optimization, self-organizing maps, U-matrix.

5340 Sustainable Energy Production with Closed-Loop Methods: Evaluating the Influence of Power Plant Age on Production Efficiency and Environmental Impact

Authors: Bujar Ismaili, Bahti Ismajli, Venhar Ismaili, Skender Ramadani

Abstract:

In Kosovo, the electricity supply problem is severe, and supply does not meet consumer demand. Most of the energy is produced by older thermal power plants, which are regarded as major environmental polluters. Our experiment is based on producing electricity with a closed-loop method that avoids environmental pollution by using waste, which would otherwise pollute the environment, as fuel. The experiment was carried out in the village of Godanc, municipality of Shtime, Kosovo. A production line was designed to produce electricity and central heating at the same time. The benefits are electricity as well as heat for central heating, obtained with minimal expense and with no flue gases released into the atmosphere. Coal, plastic, waste from wood processing, and agricultural wastes were used as raw materials. The method utilized in the experiment allows the gas released during top-to-bottom combustion of the raw material in the boiler to pass through pipes and filters, followed by filtration through waste from wood processing (sawdust). This process yields the final product, a fuel gas. The gas passes through the carburetor, enabling combustion to drive the internal combustion engine and the generator, producing electricity without releasing gases into the atmosphere. The results show that the system provides stable energy without environmental pollution from toxic substances and waste, and at low production cost. In the case of coal fuel, more electricity and higher heat release were obtained, followed by plastic waste, which also gave good results. The results obtained in these experiments indicate that the current problems of electricity and heating shortages can be addressed at lower cost, with a clean environment and proper waste management.

Keywords: Energy, heating, atmosphere, waste management, gasification.

5339 Wavelet Enhanced CCA for Minimization of Ocular and Muscle Artifacts in EEG

Authors: B. S. Raghavendra, D. Narayana Dutt

Abstract:

Electroencephalogram (EEG) recordings are often contaminated with ocular and muscle artifacts. In this paper, canonical correlation analysis (CCA) is used as a blind source separation (BSS) technique (BSS-CCA) to decompose artifact-contaminated EEG into component signals. We combine the BSS-CCA technique with a wavelet filtering approach to minimize both ocular and muscle artifacts simultaneously, and refer to the proposed method as wavelet-enhanced BSS-CCA. In this approach, after careful visual inspection, the muscle artifact components are discarded, and the ocular artifact components are subjected to wavelet filtering to retain high-frequency cerebral information before the clean EEG is reconstructed. The performance of the proposed wavelet-enhanced BSS-CCA method is tested on real EEG recordings contaminated with ocular and muscle artifacts, with power spectral density used as a quantitative measure. Our results suggest that the proposed hybrid approach minimizes ocular and muscle artifacts effectively while minimally affecting the underlying cerebral activity in the EEG recordings.

Keywords: Blind source separation, canonical correlation analysis, electroencephalogram, muscle artifact, ocular artifact, power spectrum, wavelet threshold.

5338 Prediction of Dissolved Oxygen in Rivers Using a Wang-Mendel Method – Case Study of Au Sable River

Authors: Mahmoud R. Shaghaghian

Abstract:

The amount of dissolved oxygen in a river has a great direct effect on aquatic macroinvertebrates, and this in turn influences the regional ecosystem indirectly. This paper attempts to predict dissolved oxygen in rivers by employing a simple fuzzy logic modeling approach, the Wang-Mendel method. The model uses only previous records to estimate upcoming values. For this purpose, daily and hourly records of eight stations in the Au Sable watershed in Michigan, United States, are employed, covering a 12-year period for the daily data and a 50-day period for the hourly data. The calculations indicate that for long-period prediction it is better to increase the input intervals, but for filling in missing data it is advisable to decrease the interval. Increasing the partitioning of input and output features influences accuracy only a little, but makes the model very time consuming. Increasing the number of inputs acts similarly to increasing the partitioning. A large amount of training data does not essentially improve accuracy, so an optimum training length should be selected.
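
A compact sketch of the Wang-Mendel procedure as it is usually stated: partition the variable ranges into fuzzy regions with triangular memberships, generate one rule per data pair, keep only the highest-degree rule per antecedent, and predict by weighted averaging of the rule consequents. The region count and toy dissolved-oxygen series are illustrative assumptions.

```python
import numpy as np

def tri_memberships(v, centers):
    """Triangular membership of value v in each fuzzy region."""
    mu = np.zeros(len(centers))
    for i, c in enumerate(centers):
        left = centers[i - 1] if i > 0 else c
        right = centers[i + 1] if i < len(centers) - 1 else c
        if left < v <= c:
            mu[i] = (v - left) / (c - left) if c > left else 1.0
        elif c <= v < right:
            mu[i] = (right - v) / (right - c) if right > c else 1.0
        elif v == c:
            mu[i] = 1.0
    return mu

def wang_mendel(x, y, n_regions=5):
    """One rule per (x, y) pair; conflicts resolved by keeping the highest degree."""
    cx = np.linspace(x.min(), x.max(), n_regions)
    cy = np.linspace(y.min(), y.max(), n_regions)
    rules = {}
    for xi, yi in zip(x, y):
        mx, my = tri_memberships(xi, cx), tri_memberships(yi, cy)
        degree = mx.max() * my.max()
        ant = int(mx.argmax())
        if ant not in rules or degree > rules[ant][1]:
            rules[ant] = (cy[my.argmax()], degree)   # (consequent centre, degree)
    return cx, rules

def predict(v, cx, rules):
    mu = tri_memberships(v, cx)
    num = sum(mu[a] * c for a, (c, _) in rules.items())
    den = sum(mu[a] for a in rules) or 1.0
    return num / den

# Toy use: predict tomorrow's DO from today's (hypothetical records, mg/L)
do = np.array([8.1, 7.9, 8.4, 8.8, 8.2, 7.6, 7.8, 8.5, 9.0, 8.3])
cx, rules = wang_mendel(do[:-1], do[1:])
print(predict(8.0, cx, rules))
```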

Keywords: Dissolved oxygen, Au Sable, fuzzy logic modeling, Wang-Mendel.

5337 Thermoelastic Waves in Anisotropic Plates Using Normal Mode Expansion Method with Thermal Relaxation Time

Authors: K.L. Verma

Abstract:

An analysis of generalized thermoelastic Lamb waves propagating in anisotropic thin plates in generalized thermoelasticity is presented, employing the normal mode expansion method. The displacement and temperature fields are expressed as a summation of the symmetric and antisymmetric thermoelastic modes in an orthotropic plate free of surface thermal stresses and thermal gradients; the theory is therefore particularly appropriate for waveform analyses of Lamb waves in thin anisotropic plates. The transient waveforms excited by thermoelastic expansion are analyzed for an orthotropic thin plate. The results show that the theory provides a quantitative analysis for characterizing the anisotropic thermoelastic stiffness properties of plates by wave detection. Finally, numerical calculations are presented for a NaF crystal, and the dispersion curves for the lowest symmetric and antisymmetric vibration modes are represented graphically at different values of the thermal relaxation time. The method can, however, be used for other materials as well.

Keywords: Anisotropic, dispersion, frequency, normal, thermoelasticity, wave modes.

5336 Splitting Modified Donor-Cell Schemes for Spectral Action Balance Equation

Authors: Tanapat Brikshavana, Anirut Luadsong

Abstract:

The spectral action balance equation is used to simulate short-crested wind-generated waves in shallow water areas such as coastal regions and inland waters. This equation involves two spatial dimensions, wave direction, and wave frequency, and can be solved by the finite difference method. When the equation, with its dominating propagation velocity terms, is discretized using central differences, stability problems occur if the grid spacing is chosen too coarse. In this paper, we introduce the splitting modified donor-cell scheme to avoid these stability problems, and we prove that it is consistent with the modified donor-cell scheme, with the same accuracy. The splitting modified donor-cell scheme is adopted to split the wave spectral action balance equation into four one-dimensional problems, each of which yields an independent tridiagonal linear system. Each smaller system can be solved by direct or iterative methods, and the systems can be solved at the same time, which is very fast on a multi-core computer.
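
Each of the four one-dimensional sweeps produces a tridiagonal system, which the Thomas algorithm solves directly in O(n); a standard sketch follows (the coefficients below are illustrative, not the scheme's actual entries).

```python
import numpy as np

def thomas(a, b, c, d):
    """Solve a tridiagonal system; a: sub-, b: main, c: super-diagonal, d: RHS."""
    n = len(d)
    cp, dp = np.empty(n), np.empty(n)
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):                      # forward elimination
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = np.empty(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):             # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# Quick check against a dense solve on one diagonally dominant system
n = 6
a = np.r_[0.0, np.full(n - 1, -1.0)]      # a[0] is unused
b = np.full(n, 4.0)
c = np.r_[np.full(n - 1, -1.0), 0.0]      # c[-1] is unused
d = np.arange(1.0, n + 1)
A = np.diag(b) + np.diag(a[1:], -1) + np.diag(c[:-1], 1)
print(np.allclose(thomas(a, b, c, d), np.linalg.solve(A, d)))
```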

Keywords: donor-cell scheme, parallel algorithm, spectral action balance equation, splitting method.

5335 Development of a Multi-Factorial Instrument for Accident Analysis Based on Systemic Methods

Authors: C. V. Pietreanu, S. E. Zaharia, C. Dinu

Abstract:

The present research is built on three major pillars. It commences with some considerations on accident investigation methods, pointing out both the defining aspects of, and the differences between, linear and non-linear analysis. The traditional linear approach to accident analysis describes accidents as a sequence of events, while the latest systemic models outline interdependencies between different factors and define the evolution of processes relative to a specific (normal) situation. Linear and non-linear accident analysis methods have specific limitations, so the second point of interest is the aim of discovering the drawbacks of systemic models, which becomes a starting point for developing new directions for identifying risks or data closer to the cause of incidents and accidents. Since communication represents a critical issue in the interaction of human factors, and has been proved to be the answer to problems caused by possible breakdowns in different communication procedures, the third pillar elaborates a new error-modeling instrument suitable for risk assessment and accident analysis.

Keywords: Accident analysis, multi-factorial error modeling, risk, systemic methods.
