Search results for: Wind Turbine Type Selection
2050 Experimental Measurements of Mean and Turbulence Quantities behind the Circular Cylinder by Attaching Different Number of Tripping Wires
Authors: Amir Bak Khoshnevis, Mahdieh Khodadadi, Aghil Lotfi
Abstract:
On a bluff body, roughness elements help trip a turbulent boundary layer, leading to delayed flow separation, a smaller wake, and lower form drag. In the present work, flow past a circular cylinder fitted with tripping wires is studied experimentally. The wind tunnel used to model the free stream is an open blow circuit (maximum speed = 30 m/s, maximum free-stream turbulence = 0.1%). The Reynolds number was held constant for all tests (Re = 25000). The circular cylinder used in the experiment is 20 mm in diameter and 400 mm in length. The aim of this research is to find the optimal operating configuration. Tripping wires 1 mm in diameter were installed on the cylinder in different numbers (6, 8, and 10), and the wake characteristics of the cylinder were studied for each case. The results show that the optimal arrangement for the 1 mm tripping wires is an angular spacing of 60° (i.e., 6 wires spaced 60° apart). The Strouhal number for the cylinder with 1 mm tripping wires at the 60° angular spacing showed the maximum value.
Keywords: Wake of a circular cylinder, trip wire, velocity defect, Strouhal number.
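For readers less familiar with the quantities reported above, the sketch below shows how a Strouhal number is typically formed from a measured shedding frequency, St = f_s·d/U. The cylinder diameter and Re = 25000 come from the abstract; the air viscosity, the sampling rate and the hot-wire trace itself are assumed, synthetic stand-ins rather than the authors' data.

    # Minimal sketch (not the authors' code): estimate the vortex-shedding
    # frequency from a velocity trace and form the Strouhal number St = f_s*d/U.
    # The signal is synthetic; d and Re = 25000 follow the abstract, air
    # viscosity nu is an assumption.
    import numpy as np

    d = 0.02                    # cylinder diameter [m], from the abstract
    nu = 1.5e-5                 # kinematic viscosity of air [m^2/s], assumed
    U = 25000 * nu / d          # free-stream speed implied by Re = 25000

    fs = 5000.0                 # sampling rate [Hz], assumed
    t = np.arange(0, 2.0, 1 / fs)
    f_true = 0.2 * U / d        # synthetic shedding frequency (St ~ 0.2)
    u = U + 0.1 * U * np.sin(2 * np.pi * f_true * t) + 0.02 * U * np.random.randn(t.size)

    spectrum = np.abs(np.fft.rfft(u - u.mean()))
    freqs = np.fft.rfftfreq(t.size, 1 / fs)
    f_shed = freqs[np.argmax(spectrum)]      # dominant spectral peak = shedding frequency

    print(f"U = {U:.2f} m/s, f_shed = {f_shed:.1f} Hz, St = {f_shed * d / U:.3f}")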
2049 The Effect of Ageing Treatment of Aluminum Alloys for Fuselage Structure-Light Aircraft
Authors: Shwe Wut Hmon Aye, Kay Thi Lwin, Waing Waing Kay Khine Oo
Abstract:
As the material used for the fuselage structure must possess low density and a high strength-to-weight ratio, the selection of appropriate materials for the fuselage is one of the most important tasks. Aluminum metal itself is soft and low in strength. It can be made stronger by a proper combination of suitable alloying additions, mechanical treatment and thermal treatment. The usual thermal treatment given to aluminum alloys is called age hardening or precipitation hardening. In this paper, studies are carried out on 7075 aluminum alloy to determine how to improve its strength for fuselage structures. The marked effect on the strength of the ternary alloy is clearly demonstrated at several ageing times and temperatures. It is concluded that the aluminum-zinc-magnesium alloy reaches its highest strength level under natural ageing.
Keywords: Aluminum alloy, ageing, heat treatment, strength.
2048 Principal Component Analysis-Ranking as a Variable Selection Method for the Simultaneous Spectrophotometric Determination of Phenol, Resorcinol and Catechol in Real Samples
Authors: Nahid Ghasemi, Mohammad Goodarzi, Morteza Khosravi
Abstract:
The simultaneous determination of phenol, resorcinol and catechol in multicomponent mixtures using a chemometric technique, a PC-ranking artificial neural network (PC-ranking-ANN) algorithm, is reported in this study. Based on the correlation coefficient method, 3 representative PCs are selected from the scores of the original UV spectral data (35 PCs) as the input patterns for the ANN model. The results were obtained after 8000 training iterations. The RMSEP values for phenol, resorcinol and catechol with PC-ranking-ANN were 0.6680, 0.0766 and 0.1033, respectively. The calibration ranges were 0.50-21.0, 0.50-15.1 and 0.50-20.0 μg ml-1 for phenol, resorcinol and catechol, respectively. The proposed method was successfully applied for the determination of phenol, resorcinol and catechol in synthetic and water samples.
Keywords: Phenol, Resorcinol, Catechol, Principal component ranking, Artificial Neural Network, Chemometrics.
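As a rough illustration of the PC-ranking idea described above (PCA scores ranked by their correlation with the analyte concentration, the top few fed to a small neural network), here is a minimal sketch with synthetic spectra; the data, the network size and the train/test split are assumptions, not the authors' setup.

    # Minimal sketch (assumed workflow, not the authors' code): PCA on UV spectra,
    # rank the PC scores by correlation with the analyte concentration, keep the
    # top 3, and train a small neural network. Data are synthetic stand-ins.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.neural_network import MLPRegressor
    from sklearn.metrics import mean_squared_error

    rng = np.random.default_rng(0)
    X = rng.normal(size=(60, 35))          # 60 mixture spectra x 35 wavelengths (synthetic)
    y = rng.uniform(0.5, 21.0, size=60)    # concentration of one analyte (synthetic)

    scores = PCA(n_components=35).fit_transform(X)
    corr = np.array([abs(np.corrcoef(scores[:, j], y)[0, 1]) for j in range(scores.shape[1])])
    top3 = np.argsort(corr)[::-1][:3]      # PC-ranking: keep the 3 most correlated PCs

    ann = MLPRegressor(hidden_layer_sizes=(5,), max_iter=8000, random_state=0)
    ann.fit(scores[:40, top3], y[:40])     # calibration set
    rmsep = mean_squared_error(y[40:], ann.predict(scores[40:, top3])) ** 0.5
    print("selected PCs:", top3, "RMSEP:", round(rmsep, 4))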
2047 Analysis of a Faience Enema Found in the Assasif Tomb No. -28- of the Vizier Amenhotep Huy: Contributions to the Study of the Mummification Ritual Practiced in the Theban Necropolis
Authors: Alberto Abello Moreno-Cid
Abstract:
Mummification was the process through which immortality was granted to the deceased, so it was of extreme importance to the Egyptians. The techniques of embalming evolved over the centuries, and specialists created increasingly sophisticated tools. However, due to its eminently religious nature, knowledge about everything related to this practice was jealously preserved, and the testimonies that have survived to our time are scarce. For this reason, embalming instruments found in archaeological excavations are uncommon. The tomb of the Vizier Amenhotep Huy (AT No. -28-), located in the el-Assasif necropolis, which has been excavated since 2009 by the team of the Institute of Ancient Egyptian Studies, has been the scene of several discoveries of this type that evidence the existence of mummification practices in this place after the New Kingdom. Clysters, or enemas, are the fundamental tools in the second type of mummification described by the historian Herodotus, used to introduce caustic solutions into the body of the deceased. Nevertheless, such objects have only been found in three locations: the tomb of Ankh-Hor in Luxor, where a copper enema belonging to the prophet of Ammon Uah-ib-Ra came to light; the excavation of the tomb of Menekh-ib-Nekau in Abusir, where another made of copper was found; and the excavations in the Bucheum, where two more artifacts were discovered, also made of copper but of different shapes and sizes. The latter two were used for the mummification of sacred animals, which is why they vary significantly. Therefore, the object found in tomb No. -28- is the first of these peculiar tools known to be made of faience and the oldest known to date, dated to the Third Intermediate Period (circa 1070-650 B.C.). This paper bases its investigation on the study of these parallels, the material, the current archaeological context, and the full analysis and reconstruction of the object in question. The key point is the use of faience in the production of this item: making a device intended for constant use out of faience seems at first illogical compared to the other examples made of copper. Faience around the area of Deir el-Bahari had a strong religious component, associated with solar myths and principles of resurrection, connected to the Osirian symbolism that characterises the mummification procedure. The study makes it possible to refute some premises long held unalterable in Egyptology, verifying the use of this sort of piece, clarifying how it was used, and showing that this type of mummification was also applied to the highest social stratum, for whom the tools were conceived with exceptional quality and religious symbolism.
Keywords: Clyster, el-Assasif, embalming, faience enema, mummification, Theban necropolis.
2046 Non-negative Principal Component Analysis for Face Recognition
Abstract:
Principal component analysis is often combined with state-of-the-art classification algorithms to recognize human faces. However, principal component analysis can only capture features contributing to the global characteristics of the data because it is a global feature selection algorithm; it misses features contributing to the local characteristics of the data, since each principal component only carries some level of the data's global characteristics. In this study, we present a novel face recognition approach using non-negative principal component analysis, in which a non-negativity constraint is added to improve data locality and help elucidate latent data structures. Experiments are performed on the Cambridge ORL face database. We demonstrate the strong performance of the algorithm in recognizing human faces in comparison with the PCA and NREMF approaches.
Keywords: Classification, face recognition, non-negative principal component analysis (NPCA).
2045 Multiclass Support Vector Machines for Environmental Sounds Classification Using log-Gabor Filters
Authors: S. Souli, Z. Lachiri
Abstract:
In this paper, we propose a robust environmental sound classification approach based on spectrogram features derived from log-Gabor filters. This approach includes two methods. In the first method, the spectrograms are passed through an appropriate log-Gabor filter bank; the outputs are averaged and then undergo an optimal feature selection procedure based on a mutual information criterion. The second method uses the same steps but applies them only to three patches extracted from each spectrogram.
To investigate the accuracy of the proposed methods, we conduct experiments using a large database containing 10 environmental sound classes. The classification results based on Multiclass Support Vector Machines show that the second method is the most efficient with an average classification accuracy of 89.62 %.
Keywords: Environmental sounds, Log-Gabor filters, Spectrogram, SVM Multiclass, Visual features.
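A minimal sketch of the selection-plus-classification stage described above: features are ranked by mutual information and a multiclass (one-vs-one) SVM is trained. The log-Gabor filtering itself is not reproduced; the feature matrix, class labels and parameter values below are synthetic assumptions.

    # Minimal sketch (assumptions, not the authors' code): given a matrix of
    # spectrogram-derived features (random stand-ins for the averaged log-Gabor
    # outputs), select features by mutual information and classify with a
    # multiclass SVM, as the abstract describes.
    import numpy as np
    from sklearn.feature_selection import SelectKBest, mutual_info_classif
    from sklearn.svm import SVC
    from sklearn.pipeline import make_pipeline
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(1)
    X = rng.normal(size=(300, 120))        # 300 sound clips x 120 features (synthetic)
    y = rng.integers(0, 10, size=300)      # 10 environmental sound classes (synthetic)

    clf = make_pipeline(SelectKBest(mutual_info_classif, k=40),
                        SVC(kernel="rbf", C=10, gamma="scale"))   # one-vs-one multiclass SVM
    print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())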
2044 Study of the Tribological Behavior of a Pin on Disc Type of Contact
Authors: S. Djebali, S. Larbi, A. Bilek
Abstract:
The present work aims at contributing to the study of the complex phenomenon of wear in pin-on-disc contact under dry sliding friction between two material couples (bronze/steel, and unsaturated polyester, both virgin and filled with graphite powder, against steel). The work consists of determining the coefficient of friction, studying the influence of the tribological parameters on this coefficient, and determining the mass loss and the wear rate of the pin. The study also highlights the influence of the addition of graphite powder on the tribological properties of the polymer constituting the pin. The experiments are carried out on a pin-on-disc tribometer that we have designed and manufactured. Tests are conducted according to the standards DIN 50321 and DIN EN 50324. The discs are made of annealed XC48 steel and quenched and tempered XC48 steel. The main results are described hereafter. The increase of the normal load and the sliding speed causes an increase of the friction coefficient, whereas an increase in the percentage of graphite and in the hardness of the disc surface contributes to its reduction. The mass loss also increases with the normal load. The influence of the normal load on the friction coefficient is more significant than that of the sliding speed, and the effect of the sliding speed decreases at large speed values. Increasing the amount of graphite powder leads to a decrease of the coefficient of friction, the mass loss and the wear rate. The addition of graphite to the UP resin is beneficial; it plays the role of a solid lubricant.
Keywords: Friction coefficients, mass loss, wear rate, bronze, polyester, graphite.
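For concreteness, the quantities reported above follow the standard pin-on-disc relations; the sketch below evaluates them with assumed numbers (load, friction force, mass loss, density, sliding distance), not measurements from the paper.

    # Minimal sketch (standard pin-on-disc relations, assumed numbers, not measured
    # data from the paper): friction coefficient mu = F_t / F_n and a specific
    # wear rate k = (mass loss / density) / (F_n * sliding distance).
    F_n = 10.0            # normal load [N], assumed
    F_t = 2.3             # measured tangential (friction) force [N], assumed
    mass_loss = 4.0e-6    # pin mass loss [kg], assumed
    rho = 8800.0          # pin density, e.g. bronze [kg/m^3], assumed
    distance = 1000.0     # total sliding distance [m], assumed

    mu = F_t / F_n
    k = (mass_loss / rho) / (F_n * distance)   # [m^3 / (N*m)]
    print(f"friction coefficient = {mu:.2f}, specific wear rate = {k:.3e} m^3/(N*m)")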
2043 Efficient Implementation of Serial and Parallel Support Vector Machine Training with a Multi-Parameter Kernel for Large-Scale Data Mining
Authors: Tatjana Eitrich, Bruno Lang
Abstract:
This work deals with aspects of support vector learning for large-scale data mining tasks. Based on a decomposition algorithm that can be run in serial and parallel mode, we introduce a data transformation that allows the use of an expensive generalized kernel without additional cost. In order to speed up the decomposition algorithm, we analyze the problem of working set selection for large data sets and the influence of the working set sizes on the scalability of the parallel decomposition scheme. Our modifications and settings lead to improved support vector learning performance and thus allow extensive parameter search methods to be used to optimize classification accuracy.
Keywords: Support Vector Machines, Shared Memory Parallel Computing, Large Data
2042 Study on Compressive Strength and Setting Times of Fly Ash Concrete after Slump Recovery Using Superplasticizer
Authors: Chaiyakrit Raoupatham, Ram Hari Dhakal, Chalermchai Wanichlamlert
Abstract:
Fresh concrete has a dynamic property known as slump. The slump of a mix is designed to be compatible with the placing method. Due to the hydration reaction of cement, slump is lost over time; delayed concrete may therefore be rejected because its slump has become unacceptable. To recover the slump of delayed concrete, a second dose of superplasticizer (naphthalene-based, type F) is added to the system; slump recovery is possible as long as the concrete has not set. However, adding superplasticizer to recover otherwise unusable slump-loss concrete may affect other concrete properties. This paper therefore examines the setting times and compressive strength of concrete after re-dosing with a type F chemical admixture (naphthalene-based superplasticizer) for slump recovery. The concrete used in this study was fly ash concrete with fly ash replacement levels of 0%, 30% and 50%. The test mixes were prepared with a paste content (ratio of the volume of cement to the volume of voids in the aggregate) of 1.2 and 1.3, a water-to-binder ratio (w/b) range of 0.3 to 0.58, and an initial superplasticizer (SP) dose ranging from 0.5 to 1.6%. The setting times of the concrete were tested both before and after re-dosing, for different second-dose amounts and dosing times. The research concluded that the addition of a second dose of superplasticizer increases both the initial and final setting times in accordance with the added dosage. For fly ash concrete, the prolongation effect was higher as the fly ash replacement increased, reaching a maximum of about 4 hours. In the case of compressive strength, the re-dosed concrete showed strength fluctuations within an acceptable range of ±10%.
Keywords: Compressive strength, Fly ash concrete, Second dose of superplasticizer, Slump recovery, Setting times.
2041 Why Do Pakistani Customers Patronize Islamic Banks? An Empirical Analysis
Authors: Farjana Mumu, Jia Guozho
Abstract:
Throughout the world, the Islamic way of banking and financing is growing. The same trend is visible in Pakistan, where the Islamic banking sector increases in size and volume each year. The question immediately arises as to why Pakistanis patronize the Islamic banking system. This study was carried out to find out whether following Islamic rules in finance is the main factor behind this choice, or whether other factors such as customer service, location, banking hours and the physical facilities of the bank are also important. The study was carried out by distributing a questionnaire, and 200 responses were collected from clients of Islamic banks. The results showed that service quality and other factors are as important as adherence to Islamic rules of finance in retaining old customers and attracting new ones. This finding is important, and Islamic banks can take action accordingly to attend to both sets of factors.
Keywords: Customers' perception, customer satisfaction, customer service, Islamic banking.
2040 Automatic Microaneurysm Quantification for Diabetic Retinopathy Screening
Authors: A. Sopharak, B. Uyyanonvara, S. Barman
Abstract:
Microaneurysms are a key indicator of diabetic retinopathy and can potentially cause damage to the retina. Early detection and automatic quantification are the keys to preventing further damage. In this paper, which focuses on automatic microaneurysm detection in images acquired through non-dilated pupils, we present a series of experiments on feature selection and automatic microaneurysm pixel classification. We found that the best feature set is a combination of 10 features: the pixel's intensity in the shade-corrected image, the pixel hue, the standard deviation of the shade-corrected image, DoG4, the area of the candidate MA, the perimeter of the candidate MA, the eccentricity of the candidate MA, the circularity of the candidate MA, the mean intensity of the candidate MA in the shade-corrected image, and the ratio of the major axis length to the minor axis length of the candidate MA. The overall sensitivity, specificity, precision, and accuracy are 84.82%, 99.99%, 89.01%, and 99.99%, respectively.
Keywords: Diabetic retinopathy, microaneurysm, naive Bayes classifier
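A minimal sketch of the classification step described above: a Gaussian naive Bayes classifier applied to a 10-dimensional feature vector per microaneurysm candidate. The feature values and labels below are synthetic; the image-processing pipeline that produces the real features is not reproduced.

    # Minimal sketch (not the authors' pipeline): candidate-microaneurysm
    # classification with a Gaussian naive Bayes classifier on a 10-feature
    # vector, as in the abstract. The feature values here are synthetic.
    import numpy as np
    from sklearn.naive_bayes import GaussianNB
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import recall_score, precision_score

    rng = np.random.default_rng(2)
    X = rng.normal(size=(1000, 10))                 # 10 features per candidate (synthetic)
    y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=1000) > 1.5).astype(int)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    nb = GaussianNB().fit(X_tr, y_tr)
    y_hat = nb.predict(X_te)
    print("sensitivity:", recall_score(y_te, y_hat),
          "precision:", precision_score(y_te, y_hat, zero_division=0))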
2039 Simultaneous Term Structure Estimation of Hazard and Loss Given Default with a Statistical Model using Credit Rating and Financial Information
Authors: Tomohiro Ando, Satoshi Yamashita
Abstract:
The objective of this study is to propose a statistical modeling method which enables simultaneous term structure estimation of the risk-free interest rate, hazard and loss given default, incorporating the characteristics of the bond issuing company such as credit rating and financial information. A reduced form model is used for this purpose. Statistical techniques such as spline estimation and Bayesian information criterion are employed for parameter estimation and model selection. An empirical analysis is conducted using the information on the Japanese bond market data. Results of the empirical analysis confirm the usefulness of the proposed method.
Keywords: Empirical Bayes, Hazard term structure, Loss given default.
2038 Estimating an Optimal Neighborhood Size in the Spherical Self-Organizing Feature Map
Authors: Alexandros Leontitsis, Archana P. Sangole
Abstract:
This article presents a short discussion on optimum neighborhood size selection in a spherical self-organizing feature map (SOFM). The majority of the literature on SOFMs has addressed the issue of selecting optimal learning parameters for Cartesian-topology SOFMs. However, experience with the spherical SOFM suggests that the learning aspects of the Cartesian-topology SOFM do not translate directly. This article presents an approach for estimating the neighborhood size of a spherical SOFM from the data. It adopts the L-curve criterion, previously suggested for choosing the regularization parameter in linear-equation problems whose right-hand side is contaminated with noise. Simulation results are presented on two artificial 4D data sets of the coupled Hénon-Ikeda map.
Keywords: Parameter estimation, self-organizing feature maps, spherical topology.
2037 Thermodynamic Analysis of Ventilated Façades under Operating Conditions in Southern Spain
Authors: Carlos A. D. Torres, Antonio D. Delgado
Abstract:
In this work we study the thermodynamic behavior of several ventilated façades under summer operating conditions in Southern Spain. Under these climatic conditions, indoor comfort implies a high energy demand due to the high temperatures usually reached in this season in the geographical area considered.
The aim of this work is to determine if during summer operating conditions in Southern Spain, ventilated façades provide some energy saving compared to the non-ventilated façades and to deduce their behavior patterns in terms of energy efficiency.
The air flow in the channel has been modeled using the Navier-Stokes equations for thermodynamic flows. Numerical simulations have been carried out with a 2D Finite Element approach.
This way, we analyze the behavior of ventilated façades under different weather conditions as variable wind, variable temperature and different levels of solar irradiation.
CFD computations show that the combined effect of the shading of the external wall and the ventilation by natural convection in the air gap achieves a reduction of the heat load during the summer period. This reduction has been evaluated by comparing the thermodynamic performances of two ventilated and two unventilated façades with the same geometry and thermophysical characteristics.
Keywords: Passive cooling, ventilated façades, energy-efficient building, CFD, FEM.
2036 Review of Trust Models in Wireless Sensor Networks
Authors: V. Uma Rani, K. Soma Sundaram
Abstract:
The major challenge faced by wireless sensor networks is security. Because of the dynamic and collaborative nature of sensor networks, compromised or misbehaving sensor devices can make the network unusable. To solve this issue, a trust model is required to find malicious, selfish and compromised insiders by evaluating the trustworthiness of sensors in the network. A trust model supports decision-making processes in wireless sensor networks such as pre-key distribution, cluster head selection, data aggregation, routing and self-reconfiguration of sensor nodes. This paper discusses the kinds of trust models and the trust metrics used to address attacks by monitoring certain network behaviors. It describes the major design issues in building a trust model and their countermeasures. It also discusses existing trust models used in various decision-making processes of wireless sensor networks.
Keywords: Attacks, Security, Trust, Trust model, Wireless sensor network.
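As a concrete illustration of the kind of trust metric such surveys discuss, the sketch below computes a beta-reputation score from counts of observed good and bad interactions. This is a widely used metric in the wireless-sensor-network trust literature, not a model taken from this particular paper.

    # Illustrative sketch of one widely used trust metric discussed in such
    # surveys: a beta-reputation score built from counts of good and bad
    # interactions observed for a neighbour node. Not specific to this paper.
    def beta_trust(good: int, bad: int) -> float:
        """Expected value of a Beta(good+1, bad+1) distribution."""
        return (good + 1) / (good + bad + 2)

    # A node that behaved well 18 times and misbehaved twice:
    print(round(beta_trust(18, 2), 3))   # ~0.864 -> likely trustworthy
    # A node with mostly bad behaviour:
    print(round(beta_trust(2, 18), 3))   # ~0.136 -> candidate for exclusion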
2035 Terrain Evaluation Method for Hexapod Robot
Authors: Tomas Luneckas, Dainius Udris
Abstract:
In this paper, a simple terrain evaluation method for a hexapod robot is introduced. The method is based on evaluating the foot coordinates when all feet are on the ground; from the differences between the foot coordinates, a local terrain evaluation is possible. Terrain evaluation is necessary for correct gait selection and/or body position correction. For terrain roughness evaluation, three planes are constructed: two of them use opposite feet coordinates as their defining points, and the third coincides with the robot body plane. The leaning angle of the body plane is evaluated by measuring the gravity vector with a three-axis accelerometer. The terrain roughness evaluation is based on estimating the angles between the normal vectors of these planes. The aim of this work is to present a simple method for an embedded robot controller that allows the best settings for further movement to be found.
Keywords: Hexapod robot, pose estimation, terrain evaluation, terrain roughness.
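A minimal sketch of the geometric core described above, under assumed foot coordinates: a plane normal is built from three foot contact points and the angle to the body-plane normal is taken as the local terrain inclination. The coordinates and the level body plane are illustrative assumptions.

    # Minimal sketch of the geometric idea (not the robot's firmware): build a
    # plane normal from three foot contact points and measure the angle between
    # that normal and the body-plane normal. Coordinates are assumed examples.
    import numpy as np

    def plane_normal(p1, p2, p3):
        n = np.cross(np.asarray(p2) - p1, np.asarray(p3) - p1)
        return n / np.linalg.norm(n)

    feet = [(0.2, 0.1, 0.00), (-0.2, 0.1, 0.03), (0.0, -0.2, -0.01)]  # assumed [m]
    terrain_n = plane_normal(*feet)
    body_n = np.array([0.0, 0.0, 1.0])          # body plane assumed level here

    angle = np.degrees(np.arccos(abs(np.dot(terrain_n, body_n))))
    print(f"terrain inclination relative to body plane: {angle:.1f} deg")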
2034 Improving Survivability in Wireless Ad Hoc Network
Authors: Seyed Ali Sadat Noori, Elham Sahebi Bazaz
Abstract:
Topological changes in mobile ad hoc networks frequently render routing paths unusable. Such recurrent path failures have detrimental effects on quality of service. A suitable technique for eliminating this problem is to use multiple backup paths between the source and the destination in the network. This paper proposes an effective and efficient protocol for constructing a backup, disjoint path set in an ad hoc wireless network. The protocol converges very fast to a highly reliable path set with no message exchange overhead. Path selection according to this algorithm is beneficial for mobile ad hoc networks, since it produces a set of backup paths with higher reliability. Simulation experiments are conducted to evaluate the performance of our algorithm in terms of the number of routes in the path set and its reliability. In order to acquire link reliability estimates, we use the link expiration time (LET) between two nodes.
Keywords: Wireless Ad Hoc Networks, Reliability, Routing, Disjoint Path.
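Since the abstract relies on the link expiration time (LET) between two nodes, the sketch below evaluates a commonly cited closed-form LET that assumes both nodes keep a constant speed and heading within a fixed transmission range r; the exact variant used in the paper is not stated in the abstract, and the positions, speeds and range below are assumed.

    # Sketch of the link expiration time (LET) between two mobile nodes, using a
    # commonly cited closed form that assumes both nodes keep a constant speed
    # and heading inside transmission range r. Numbers below are assumptions.
    import math

    def let(xi, yi, vi, ti, xj, yj, vj, tj, r):
        a = vi * math.cos(ti) - vj * math.cos(tj)
        b = xi - xj
        c = vi * math.sin(ti) - vj * math.sin(tj)
        d = yi - yj
        if a == 0 and c == 0:               # identical velocity vectors: link never expires
            return math.inf
        disc = (a * a + c * c) * r * r - (a * d - b * c) ** 2
        return (-(a * b + c * d) + math.sqrt(max(disc, 0.0))) / (a * a + c * c)

    # Two nodes 50 m apart moving toward each other (closing speed 8 m/s), range 100 m:
    print(round(let(0, 0, 5, 0.0, 50, 0, 3, math.pi, 100), 2), "s")   # 18.75 s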
2033 Dempster-Shafer's Approach for Autonomous Virtual Agent Navigation in Virtual Environments
Authors: Jafreezal Jaafar, Eric McKenzie
Abstract:
This paper presents a solution for the behavioural animation of autonomous virtual agent navigation in virtual environments. We focus on using Dempster-Shafer's Theory of Evidence in developing a visual sensor for the virtual agent. The role of the visual sensor is to capture information about the virtual environment and to identify which part of an obstacle can be seen from the position of the virtual agent. This information is required for the virtual agent to coordinate navigation in the virtual environment. The virtual agent uses a fuzzy controller as a navigation system and a fuzzy α-level for the action selection method. The results clearly demonstrate that the path produced is reasonably smooth, even though there are some sharp turns, and that it does not divert too far from the potential shortest path. This indicates the benefit of our method, with more reliable and accurate paths produced during the navigation task.
Keywords: Agent, navigation, Dempster Shafer, fuzzy logic.
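For readers unfamiliar with the evidential machinery referred to above, here is a minimal sketch of Dempster's rule of combination for two mass functions over a small frame such as {free, obstacle}. The frame and the mass values are illustrative assumptions, not taken from the paper.

    # Minimal sketch of Dempster's rule of combination for two simple mass
    # functions over the frame {free, obstacle} (the frame and the numbers are
    # illustrative assumptions, not taken from the paper).
    from itertools import product

    def combine(m1, m2):
        """Dempster's rule for mass functions keyed by frozensets."""
        combined, conflict = {}, 0.0
        for (a, x), (b, y) in product(m1.items(), m2.items()):
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + x * y
            else:
                conflict += x * y              # mass assigned to contradictory evidence
        return {k: v / (1.0 - conflict) for k, v in combined.items()}

    F, O = frozenset({"free"}), frozenset({"obstacle"})
    theta = F | O                              # total ignorance
    m_sensor1 = {F: 0.6, O: 0.1, theta: 0.3}
    m_sensor2 = {F: 0.5, O: 0.2, theta: 0.3}
    print(combine(m_sensor1, m_sensor2))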
2032 Comprehensive Risk Assessment Model in Agile Construction Environment
Authors: Jolanta Tamošaitienė
Abstract:
The article focuses on a comprehensive model, developed for use in an agile environment, for risk assessment and selection based on multi-attribute methods. The model is based on a multi-attribute evaluation of risk in construction, and the optimality criterion values are calculated using complex Multiple Criteria Decision-Making methods. The model may be further applied to risk assessment in an agile construction environment. The attributes of risk in a construction project are selected by applying the risk assessment conditions of the construction sector, with construction process efficiency accounting for the agile environment. The paper presents the comprehensive risk assessment model in an agile construction environment, providing a background, a description of the proposed model, and the developed analysis of the model together with its criteria.
Keywords: Assessment, environment, agile, model, risk.
2031 Impact of Health Sector Economic Reforms in Underdeveloped Countries
Authors: Haga Elimam
Abstract:
This paper investigates the meaning, and some of the practical implications, of economic reform of the health sector in underdeveloped countries. The paper examines the issues that economic reforms have to address and the policy targets they are expected to accomplish. The work argues that the development of economic reform is connected not only with understanding and refining priorities, but also with reforming and restructuring the organizations through which health policies are implemented. Considering the various organizational values that are likely to be common to all economic reform programs, a purely regulatory approach to institutional reform is unsuitable. The paper further argues that the selection of economic reform may well be influenced by technical suggestions and analysis, but the decision to proceed, and the consequent success of execution, ultimately depend on its ongoing political sustainability. The paper concludes by giving examples of institutional reforms from various underdeveloped countries and includes recommendations on the responsibility and control of donor organizations.
Keywords: Economic Reform, Health Sector, Underdeveloped Countries.
2030 Performance Analysis of MT Evaluation Measures and Test Suites
Authors: Yao Jian-Min, Lv Qiang, Zhang Jing
Abstract:
Many measures have been proposed for machine translation evaluation (MTE), while little research has been done on the performance of MTE methods themselves. This paper is an effort toward MTE performance analysis. A general framework is proposed for describing an MTE measure and its test suite, covering whether the automatic measure is consistent with human evaluation, whether results from different measures or test suites are consistent with each other, whether the content of the test suite is suitable for performance evaluation, the degree of difficulty of the test suite and its influence on the MTE, the relationship between the significance of MTE results and the size of the test suite, etc. To clarify the framework, several experimental results are analyzed relating human evaluation, BLEU evaluation, and typological MTE. A visualization method is introduced for better presentation of the results. The study aims to aid test suite construction and method selection in MTE practice.
Keywords: Machine translation, natural language processing, visualization.
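As a concrete reminder of what the BLEU evaluation mentioned above computes, the sketch below implements the modified n-gram precision with a brevity penalty for unigrams and bigrams on toy sentences; it is an illustration of the standard BLEU idea, not the test suites or measures analysed in the paper.

    # Minimal sketch of the modified n-gram precision at the core of BLEU (the
    # automatic MT measure the abstract analyses). Toy sentences, unigrams and
    # bigrams only; an illustration, not the evaluation used in the paper.
    import math
    from collections import Counter

    def ngrams(tokens, n):
        return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

    def bleu(candidate, reference, max_n=2):
        c, r = candidate.split(), reference.split()
        precisions = []
        for n in range(1, max_n + 1):
            cand, ref = ngrams(c, n), ngrams(r, n)
            overlap = sum(min(count, ref[g]) for g, count in cand.items())
            precisions.append(overlap / max(sum(cand.values()), 1))
        if not all(precisions):
            return 0.0
        bp = 1.0 if len(c) > len(r) else math.exp(1 - len(r) / len(c))   # brevity penalty
        return bp * math.exp(sum(math.log(p) for p in precisions) / max_n)

    print(bleu("the cat sat on the mat", "the cat is on the mat"))   # ~0.707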
2029 Oncogene Identification using Filter based Approaches between Various Cancer Types in Lung
Authors: Michael Netzer, Michael Seger, Mahesh Visvanathan, Bernhard Pfeifer, Gerald H. Lushington, Christian Baumgartner
Abstract:
Lung cancer accounts for the most cancer-related deaths among both men and women. The identification of cancer-associated genes and the related pathways is essential, as it offers an important opportunity for the prevention of many types of cancer. In this work, two filter approaches, namely information gain and the biomarker identifier (BMI), are used to discriminate between different types of small-cell and non-small-cell lung cancer. A new method for determining the BMI thresholds is proposed to prioritize genes (i.e., as primary, secondary and tertiary) using a k-means clustering approach. Sets of key genes were identified that can be found in several pathways. It turned out that the modified BMI is well suited for microarray data, and BMI is therefore proposed as a powerful tool in the search for new, so far undiscovered, cancer-related genes.
Keywords: lung cancer, micro arrays, data mining, feature selection.
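A minimal sketch of the information-gain filter named above: genes are ranked by their mutual information with the class label. The expression matrix and labels are synthetic, and the BMI scoring with k-means thresholding proposed in the paper is not reproduced here.

    # Minimal sketch of information-gain (mutual-information) ranking of genes,
    # one of the two filter approaches named in the abstract; the BMI scoring
    # and k-means thresholding are not reproduced. Expression data are synthetic.
    import numpy as np
    from sklearn.feature_selection import mutual_info_classif

    rng = np.random.default_rng(3)
    X = rng.normal(size=(80, 500))          # 80 samples x 500 genes (synthetic)
    y = rng.integers(0, 2, size=80)         # small-cell vs. non-small-cell label (synthetic)
    X[:, 10] += 2.0 * y                     # make gene 10 genuinely informative

    gain = mutual_info_classif(X, y, random_state=0)
    top = np.argsort(gain)[::-1][:20]       # 20 highest-ranked candidate genes
    print("top-ranked gene indices:", top[:5])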
2028 Supergrid Modeling and Operation and Control of Multi Terminal DC Grids for the Deployment of a Meshed HVDC Grid in South Asia
Authors: Farhan Beg, Raymond Moberly
Abstract:
The Indian subcontinent is facing a massive challenge with regard to energy security in its member countries: providing reliable electricity to facilitate development across various sectors of the economy and consequently achieve developmental targets. The instability of the current precarious situation is observable in frequent system failures and blackouts.
The deployment of an interconnected electricity ‘Supergrid’ designed to carry large amounts of power across the Indian subcontinent is proposed in this paper. Not only would it enable energy security in the subcontinent, it would also provide a platform for the integration of Renewable Energy Sources (RES). This paper assesses the need and conditions for a Supergrid deployment and consequently proposes a meshed topology based on Voltage Source Converter High Voltage Direct Current (VSC-HVDC) converters for the Supergrid model. Various control schemes for voltage and power are utilized to regulate the network parameters. A 3-terminal Multi-Terminal Direct Current (MTDC) network is used for the simulations.
Keywords: Super grid, Wind and Solar energy, High Voltage Direct Current, Electricity management, Load Flow Analysis.
2027 Improving Performance of World Wide Web by Adaptive Web Traffic Reduction
Authors: Achuthsankar S. Nair, J. S. Jayasudha
Abstract:
The ever-increasing use of the World Wide Web on the existing network results in poor performance. Several techniques have been developed for reducing web traffic, such as compressing file sizes, saving web pages at the client side, and changing the bursty nature of traffic into a constant rate. No single method is adequate on its own to access documents instantly over the Internet. In this paper, adaptive hybrid algorithms are developed for reducing web traffic. Intelligent agents are used to monitor the web traffic. Depending upon the bandwidth usage, the user's preferences, and the server and browser capabilities, the intelligent agents use the best techniques to achieve maximum traffic reduction. Web caching, compression, filtering, optimization of HTML tags, and traffic dispersion are incorporated into this adaptive selection. Using this new hybrid technique, latency is reduced to 20-60% and the cache hit ratio is increased to 40-82%.
Keywords: Bandwidth, Congestion, Intelligent Agents, Prefetching, Web Caching.
2026 Integrating Security Indifference Curve to Formal Decision Evaluation
Authors: Anon Yantarasri, Yachai Limpiyakorn
Abstract:
Decisions are regularly made during a project or in daily life. Some decisions are critical and have a direct impact on project or human success. Formal evaluation is thus required, especially for crucial decisions, to arrive at the optimal solution among the alternatives addressing an issue. According to microeconomic theory, all of people's decisions can be modeled as indifference curves. The proposed approach supports formal analysis and decision-making by constructing an indifference curve model from previous experts' decision criteria. The knowledge embedded in the system can be reused and can help naïve users select an alternative solution for a similar problem. Moreover, the method is flexible enough to cope with an unlimited number of factors influencing the decision-making. The preliminary experimental results of alternative selection accurately match the experts' decisions.
Keywords: Decision Analysis and Resolution, Indifference Curve, Multi-criteria Decision Making.
2025 Historical Landscape Affects Present Tree Density in Paddy Field
Authors: Ha T. Pham, Shuichi Miyagawa
Abstract:
Ongoing landscape transformation is one of the major causes behind the disappearance of traditional landscapes and leads to species and resource loss. Trees in paddy fields in the northeast of Thailand are one of those traditional landscapes. Using three different historical time layers, we documented the severe deforestation and rapid urbanization that took place in the region. Despite the general expectation of a decline in tree density as a consequence, the heterogeneous trend of changes in total tree density across the three studied landscapes rejected the hypothesis that the number of trees in paddy fields depends on the length of land use practice. Instead, because new trees are selectively planted on levees, the existence of trees in paddy fields now relies on their value for human use. Moreover, changes in land use and landscape structure had a significant impact on the decision of which tree density level is considered suitable for the landscape.
Keywords: Aerial photographs, land use change, traditional landscape, tree in paddy fields.
2024 A New Time-Frequency Speech Analysis Approach Based On Adaptive Fourier Decomposition
Authors: Liming Zhang
Abstract:
In this paper, a new adaptive Fourier decomposition (AFD) based time-frequency speech analysis approach is proposed. Given that the fundamental frequency of speech signals often fluctuates, classical short-time Fourier transform (STFT) based spectrogram analysis suffers from the difficulty of window size selection. AFD is a newly developed signal decomposition theory designed to deal with time-varying, non-stationary signals. Its outstanding characteristic is that it provides an instantaneous frequency for each decomposed component, so time-frequency analysis becomes easier. Experiments are conducted on a sample sentence from the TIMIT Acoustic-Phonetic Continuous Speech Corpus. The results show that the AFD-based time-frequency distribution outperforms the STFT-based one.
Keywords: Adaptive fourier decomposition, instantaneous frequency, speech analysis, time-frequency distribution.
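To make the baseline concrete, the sketch below computes the classical STFT spectrogram that the abstract compares against, showing how the window length (nperseg) trades time resolution against frequency resolution; AFD itself is not implemented here, and the chirp signal and parameters are assumptions.

    # Sketch of the classical STFT spectrogram used as the baseline in the
    # abstract (AFD itself is not reproduced). The window length nperseg is the
    # parameter whose choice AFD is said to avoid; the signal is a synthetic chirp.
    import numpy as np
    from scipy.signal import spectrogram, chirp

    fs = 16000
    t = np.arange(0, 1.0, 1 / fs)
    x = chirp(t, f0=100, f1=400, t1=1.0, method="linear")   # fluctuating "pitch"

    for nperseg in (128, 1024):                              # short vs. long window
        f, tt, Sxx = spectrogram(x, fs=fs, nperseg=nperseg)
        peak_track = f[np.argmax(Sxx, axis=0)]               # dominant frequency per frame
        print(nperseg, "frames:", len(tt),
              "freq. resolution [Hz]:", round(f[1] - f[0], 1),
              "first/last peak [Hz]:", peak_track[0], peak_track[-1])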
2023 Driver Readiness in Autonomous Vehicle Take-Overs
Authors: Abdurrahman Arslanyilmaz, Salman Al Matouq, Durmus V. Doner
Abstract:
Level 3 autonomous vehicles are able to take full responsibility for the control of the vehicle unless a system boundary is reached or a system failure occurs, in which case the driver is expected to take over control of the vehicle. When this happens, the driver is often not aware of the traffic situation or is engaged in a secondary task. Factors affecting the duration and quality of take-overs in these situations include secondary task type and nature, traffic density, take-over request (TOR) time, and TOR warning type and modality. However, to the best of the authors' knowledge, no prior study has examined the time buffer for TORs when a system failure occurs immediately before an intersection. The first objective of this study is to investigate the effect of the time buffer (3 and 7 seconds) on the duration and quality of take-overs when a system failure occurs just prior to intersections. In addition, eye-tracking has become one of the most popular methods of reporting what individuals view, in what order, for how long, and how often, and it has been utilized in driving simulations with various objectives. However, to the extent of the authors' knowledge, no study has compared drivers' eye gaze behavior under the two different time buffers in order to examine drivers' attention to and comprehension of salient information. The second objective is to understand the driver's attentional focus on, and comprehension of, salient traffic-related information presented on different parts of the dashboard and on the road.
Keywords: Autonomous vehicles, driving simulation, eye gaze, attention, comprehension, take-over duration, take-over quality, time buffer.
2022 Comparison of Methods of Estimation for Use in Goodness of Fit Tests for Binary Multilevel Models
Authors: I. V. Pinto, M. R. Sooriyarachchi
Abstract:
It is frequently observed that data arising in our environment have a hierarchical or nested structure. Multilevel modelling is a modern approach to handling this kind of data. When multilevel modelling is combined with a binary response, the estimation methods become complex in nature, and the usual techniques are derived from the quasi-likelihood method. The estimation methods compared in this study are marginal quasi-likelihood of orders 1 and 2 (MQL1, MQL2) and penalized quasi-likelihood of orders 1 and 2 (PQL1, PQL2). A statistical model is of no use if it does not reflect the given dataset; therefore, checking the adequacy of the fitted model through a goodness-of-fit (GOF) test is an essential stage in any modelling procedure. However, prior to usage, it is equally important to confirm that the GOF test performs well and is suitable for the given model. This study assesses the suitability of the GOF test developed for binary-response multilevel models with respect to the method used in model estimation. An extensive set of simulations was conducted using MLwiN (v 2.19) with varying numbers of clusters, cluster sizes and intra-cluster correlations. The test maintained the desirable Type-I error for models estimated using PQL2, and it failed for almost all the combinations of MQL. The power of the test was adequate for most of the combinations under all estimation methods except MQL1. Moreover, models were fitted to a real-life dataset using the four methods, and the performance of the test was compared for each model.
Keywords: Goodness-of-fit test, marginal quasi-likelihood, multilevel modelling, type-I error, penalized quasi-likelihood, power, quasi-likelihood.
2021 A Novel Pareto-Based Meta-Heuristic Algorithm to Optimize Multi-Facility Location-Allocation Problem
Authors: Vahid Hajipour, Samira V. Noshafagh, Reza Tavakkoli-Moghaddam
Abstract:
This article proposes a novel Pareto-based multi-objective meta-heuristic algorithm named the non-dominated ranking genetic algorithm (NRGA) to solve the multi-facility location-allocation problem. In NRGA, a fitness value representing rank is assigned to each individual of the population. Moreover, two ranked-based roulette wheel selection features are utilized: selecting the fronts and choosing solutions from the fronts. The proposed solution methodology is validated using several examples taken from the specialized literature. The performance of our approach shows that the NRGA algorithm is able to generate true and well-distributed Pareto optimal solutions.
Keywords: Non-dominated ranking genetic algorithm, Pareto solutions, Multi-facility location-allocation problem.
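A minimal sketch of the two NRGA ingredients named above, under assumed random objectives: the population is sorted into non-dominated fronts, and a ranked roulette wheel first picks a front and then a solution inside it (within a front, NRGA would rank by a secondary fitness criterion; here the discovery order is used for simplicity).

    # Minimal sketch (not the authors' implementation): (1) sort a population
    # into non-dominated fronts, (2) pick parents with a ranked roulette wheel,
    # first over fronts and then over solutions inside the chosen front.
    # Objectives are random stand-ins; minimisation is assumed.
    import random

    def dominates(a, b):
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

    def non_dominated_fronts(objs):
        remaining, fronts = list(range(len(objs))), []
        while remaining:
            front = [i for i in remaining
                     if not any(dominates(objs[j], objs[i]) for j in remaining if j != i)]
            fronts.append(front)
            remaining = [i for i in remaining if i not in front]
        return fronts

    def ranked_roulette(items):
        """Linear rank-based roulette: earlier items get proportionally more weight."""
        n = len(items)
        weights = [n - k for k in range(n)]
        return random.choices(items, weights=weights, k=1)[0]

    random.seed(0)
    objs = [(random.random(), random.random()) for _ in range(12)]   # 2 objectives
    fronts = non_dominated_fronts(objs)
    front = ranked_roulette(fronts)          # better (earlier) fronts are more likely
    parent = ranked_roulette(front)          # then a solution from that front
    print("fronts:", fronts, "\nchosen parent index:", parent)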