Search results for: Time Lag
903 Chose the Right Mutation Rate for Better Evolve Combinational Logic Circuits
Authors: Emanuele Stomeo, Tatiana Kalganova, Cyrille Lambert
Abstract:
Evolvable hardware (EHW) is a developing field that applies evolutionary algorithms (EAs) to automatically design circuits, antennas, robot controllers, etc. A great deal of research has been done in this area, and several different EAs have been introduced to tackle problems such as scalability and evolvability. However, every time a specific EA is chosen for a particular task, all of its components, such as population size, initialization, selection mechanism, mutation rate, and genetic operators, must be selected in order to achieve the best results. Over the last three decades, the selection of the right parameters for EA components on different test problems has been investigated. In this paper, the behaviour of the mutation rate when designing logic circuits, which has not been analyzed before, is studied in depth. In an EHW system, mutation modifies the number of inputs of each logic gate, its functionality (for example, from AND to NOR), and the connectivity between logic gates. The behaviour of mutation has been analyzed with respect to the number of generations, genotype redundancy, and the number of logic gates in the evolved circuits. The experimental results describe the behaviour of the mutation rate during evolution for the design and optimization of simple logic circuits, and suggest the best mutation rate to use when designing combinational logic circuits. The research presented is particularly important for those who would like to implement a dynamic mutation rate inside an evolutionary algorithm for evolving digital circuits. Research on the mutation rate over the last 40 years is also summarized.
Keywords: Design of logic circuits, evolutionary computation, evolvable hardware, mutation rate.
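As a rough illustration of the three kinds of mutation the abstract describes (gate functionality, input count, and connectivity), the sketch below point-mutates a feed-forward gate-array genotype. The encoding, rates, and function set are hypothetical choices for illustration, not taken from the paper.

```python
import random

# Hypothetical genotype: each gate is (function, input_indices), where input
# indices refer to primary inputs (0..n_in-1) or earlier gates (n_in..).
FUNCTIONS = ["AND", "OR", "NAND", "NOR", "XOR"]

def mutate(genotype, n_in, rate=0.1, rng=random):
    """Return a mutated copy: gate function, input count, or wiring may change."""
    out = []
    for pos, (func, inputs) in enumerate(genotype):
        inputs = list(inputs)
        if rng.random() < rate:                 # functionality, e.g. AND -> NOR
            func = rng.choice(FUNCTIONS)
        if rng.random() < rate:                 # number of inputs: toggle 2 <-> 3
            if len(inputs) == 2:
                inputs.append(rng.randrange(n_in + pos))
            else:
                inputs.pop()
        for k in range(len(inputs)):            # connectivity: rewire an input
            if rng.random() < rate:
                # only earlier signals are legal, keeping the circuit feed-forward
                inputs[k] = rng.randrange(n_in + pos)
        out.append((func, inputs))
    return out

# Example: two primary inputs, three gates.
g = [("AND", [0, 1]), ("OR", [0, 2]), ("XOR", [2, 3])]
m = mutate(g, n_in=2, rng=random.Random(42))
```

Because every rewired input is drawn from signals earlier in the array, the mutated circuit is always a valid combinational (acyclic) network.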
902 Optimization of Reaction Rate Parameters in Modeling of Heavy Paraffins Dehydrogenation
Authors: Leila Vafajoo, Farhad Khorasheh, Mehrnoosh Hamzezadeh Nakhjavani, Moslem Fattahi
Abstract:
In the present study, a procedure was developed to determine the optimum reaction rate constants in generalized Arrhenius form, optimized through the Nelder-Mead method. For this purpose, a comprehensive mathematical model of a fixed bed reactor for the dehydrogenation of heavy paraffins over a Pt-Sn/Al2O3 catalyst was developed. Utilizing appropriate kinetic rate expressions for the main dehydrogenation reaction as well as side reactions and catalyst deactivation, a detailed model for the radial flow reactor was obtained. The reactor model was composed of a set of partial differential equations (PDEs), ordinary differential equations (ODEs), and algebraic equations, all of which were solved numerically to determine variations in component concentrations, in terms of mole percent, as a function of time and reactor radius. It was demonstrated that the most significant variations were observed at the entrance of the bed, and that the initial olefin production obtained was rather high. The aforementioned method utilized a direct-search optimization algorithm along with the numerical solution of the governing differential equations. The usefulness and validity of the method were demonstrated by comparing the predicted values of the kinetic constants with experimental values reported in the literature for different systems.
Keywords: Dehydrogenation, Pt-Sn/Al2O3 catalyst, modeling, Nelder-Mead, optimization.
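The core idea, fitting Arrhenius parameters with the derivative-free Nelder-Mead direct search, can be sketched as follows. The synthetic rate data, parameter values, and least-squares objective below are illustrative assumptions, not the paper's reactor model.

```python
import numpy as np
from scipy.optimize import minimize  # Nelder-Mead is a direct-search method

R = 8.314  # gas constant, J/(mol K)

# Synthetic "experimental" rate constants from a known Arrhenius law
# k = A * exp(-Ea / (R T)); the true values below are illustrative.
T = np.linspace(700.0, 900.0, 9)              # temperatures, K
ln_k_data = np.log(1.0e5) - 1.0e5 / (R * T)   # A = 1e5, Ea = 100 kJ/mol

def objective(p):
    """Sum of squared residuals in log space; p = (ln A, Ea in kJ/mol)."""
    ln_A, Ea_kJ = p
    ln_k_model = ln_A - Ea_kJ * 1000.0 / (R * T)
    return np.sum((ln_k_model - ln_k_data) ** 2)

res = minimize(objective, x0=[8.0, 80.0], method="Nelder-Mead",
               options={"xatol": 1e-8, "fatol": 1e-12, "maxiter": 5000})
ln_A_fit, Ea_fit = res.x
```

Working in log space and expressing Ea in kJ/mol keeps the two search variables on comparable scales, which helps the simplex converge.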
901 Fabrication and Characterization of Al2O3 Based Electrical Insulation Coatings Around SiC Fibers
Authors: S. Palaniyappan, P. K. Chennam, M. Trautmann, H. Ahmad, T. Mehner, T. Lampke, G. Wagner
Abstract:
In structural-health monitoring of fiber-reinforced plastics (FRPs), every inorganic fiber sensor that is integrated into the bulk material requires electrical insulation around itself when the surrounding reinforcing fibers are electrically conductive. This allows more accurate data acquisition from the sensor fiber alone, without electrical interference. For this purpose, thin nano-films of aluminium oxide (Al2O3)-based electrical-insulation coatings have been fabricated around silicon carbide (SiC) single fiber sensors through the reactive DC magnetron sputtering technique. The sputtered coatings were amorphous in nature, and the thickness of the coatings increased with increasing sputter time. Microstructural characterization of the coated fibers performed using scanning electron microscopy (SEM) confirmed a homogeneous circumferential coating with no detectable defects or cracks on the surface. X-ray diffraction (XRD) analyses of the as-sputtered coatings and of coatings annealed for 2 hours (at 825 and 1125 °C) revealed the amorphous and crystalline phases of Al2O3, respectively. Raman spectroscopic analyses produced no characteristic bands of Al2O3, as the thickness of the films was in the nanometer (nm) range, too small relative to the penetration depth of the laser used. In addition, the influence of the insulation coatings on the mechanical properties of the SiC sensor fibers has been analyzed.
Keywords: Al2O3 insulation coating, reactive sputtering, SiC single fiber sensor, single fiber tensile test.
900 Influence of the Granular Mixture Properties on the Rheological Properties of Concrete: Yield Stress Determination Using Modified Chateau et al. Model
Authors: Rachid Zentar, Mokrane Bala, Pascal Boustingorry
Abstract:
The prediction of the rheological behavior of concrete is at the center of current concerns of the concrete industry, for several reasons. The shortage of good-quality standard materials, combined with the variable properties of available materials, makes it necessary to improve existing models so that these variations are taken into account at the design stage of concrete. The main reasons for improving the predictive models are, of course, saving time and cost at the design stage, as well as optimizing concrete performance. In this study, we highlight the different properties of granular mixtures that affect the rheological properties of concrete. Our objective is to identify the intrinsic parameters of the aggregates which make it possible to predict the yield stress of concrete. The work was done using two typologies of grains: crushed and rolled aggregates. The experimental results have shown that the rheology of concrete is improved by increasing the packing density of the granular mixture using rolled aggregates. The experimental program carried out allowed the yield stress of concrete to be modelled by a modified Chateau et al. model through a dimensionless parameter following the Krieger-Dougherty law. The modelling confirms that the yield stress of concrete depends not only on the properties of the cement paste but also on the packing density of the granular skeleton and the shape of the grains.
Keywords: Crushed aggregates, intrinsic viscosity, packing density, rolled aggregates, slump, yield stress of concrete.
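The Chateau et al. approach relates the yield stress of a suspension to the solid fraction and the packing density of its granular skeleton. The sketch below uses the commonly cited Chateau-Ovarlez-Trung form combined with a Krieger-Dougherty exponent; the functional form is quoted from the general literature and the parameter values are illustrative, neither being the fitted model of this paper.

```python
def relative_yield_stress(phi, phi_max, intrinsic_viscosity=2.5):
    """Yield stress of the suspension divided by that of the paste.

    Commonly cited form: tau_c(phi)/tau_c(0)
        = sqrt((1 - phi) * (1 - phi/phi_max)**(-m)),
    with m = intrinsic_viscosity * phi_max (Krieger-Dougherty exponent).
    """
    if not 0 <= phi < phi_max:
        raise ValueError("solid fraction must satisfy 0 <= phi < phi_max")
    m = intrinsic_viscosity * phi_max
    return ((1.0 - phi) * (1.0 - phi / phi_max) ** (-m)) ** 0.5

# Denser packing (higher phi_max, as obtained with rolled rather than crushed
# aggregates) lowers the relative yield stress at the same solid fraction:
crushed = relative_yield_stress(0.50, phi_max=0.60)
rolled = relative_yield_stress(0.50, phi_max=0.68)
```

This reproduces the trend reported in the abstract: improving the packing density of the granular skeleton reduces the yield stress for a given cement paste.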
899 Static and Dynamic Three-Dimensional Finite Element Analysis of Pelvic Bone
Authors: M. S. El-Asfoury, M. A. El-Hadek
Abstract:
The complex shape of the human pelvic bone was successfully imaged and modeled using finite element (FE) processing. The bone was subjected to quasi-static and dynamic loading conditions simulating the effect of both weight gain and impact. Loads varying between 500 and 2500 N (roughly 50 to 250 kg of weight) were used to simulate 3D quasi-static weight gain. Two different 3D dynamic analyses, free fall of the body from two different heights (1 and 2 m) and forced side impact at two different velocities (20 and 40 km/h), were also studied. The computed stresses were compared for the four loading cases; the von Mises stresses increase linearly with weight gain under quasi-static loading. For the dynamic models, the von Mises stress histories were studied for the affected area and applied load with respect to time. The von Mises stresses normalized with respect to the applied load were used to compare the free fall and forced impact results. It was found that under forced impact loading an overlapping behavior was observed, whereas for free fall the normalized von Mises stress behavior differed nonlinearly. This phenomenon was explained through the concept of energy dissipation. This study will help designers in different specializations to identify the weakest spots when designing supporting systems.
Keywords: Pelvic bone, static and dynamic analysis, three-dimensional finite element analysis.
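For reference, the von Mises equivalent stress used for these comparisons is computed from the six stress tensor components in the standard way. The sketch below is generic, and the stress values and load are illustrative, not taken from the paper's FE model.

```python
import math

def von_mises(sx, sy, sz, txy=0.0, tyz=0.0, tzx=0.0):
    """Von Mises equivalent stress from the six stress tensor components."""
    return math.sqrt(0.5 * ((sx - sy) ** 2 + (sy - sz) ** 2 + (sz - sx) ** 2)
                     + 3.0 * (txy ** 2 + tyz ** 2 + tzx ** 2))

# Normalizing by the applied load, as done for the free-fall vs. impact
# comparison (all numbers here are made up for illustration):
stress = von_mises(120.0, 40.0, 10.0, txy=25.0)   # MPa
load = 2500.0                                      # N
normalized = stress / load
```

Two sanity checks: under uniaxial tension the equivalent stress equals the applied stress, and under pure shear it equals sqrt(3) times the shear stress.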
898 Twitter Sentiment Analysis during the Lockdown on New Zealand
Authors: Smah Doeban Almotiri
Abstract:
One of the most common applications of natural language processing (NLP) is sentiment analysis. The feeling implied in a text can be successfully mined for various events using sentiment analysis. Twitter is viewed as a reliable data source for sentiment analytics studies, since people used social media to receive and exchange different types of data on a broad scale during the COVID-19 epidemic. Processing such data may aid in making critical decisions on how to keep the situation under control. The aim of this research is to look at how sentiment states differed in a single geographic region during lockdown at two different times. 1162 tweets related to the COVID-19 pandemic lockdown were analyzed using the keyword hashtags (lockdown, COVID-19); the first sample of tweets ran from March 23, 2020 until April 23, 2020, and the second sample, from the following year, ran from March 1, 2021 until April 4, 2021. NLP, a form of artificial intelligence, was used to calculate the sentiment value of all of the tweets using the AFINN lexicon sentiment analysis method. The findings revealed that sentiment during both lockdown periods was positive in the samples of this study, which are specific to the geographical area of New Zealand. Future research could apply machine-learning sentiment methods such as Crystal Feel and extend the sample by using more tweets collected over a longer period of time.
Keywords: Sentiment analysis, Twitter analysis, lockdown, COVID-19, AFINN, Node.js.
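AFINN-style scoring simply sums integer valence ratings for the words of a text that appear in a lexicon. The sketch below uses a tiny hand-written subset standing in for the real AFINN list, which contains thousands of terms rated from -5 to +5.

```python
import re

# Tiny illustrative lexicon; not the actual AFINN word list.
LEXICON = {"good": 3, "great": 3, "happy": 3, "safe": 1,
           "bad": -3, "sad": -2, "fear": -2, "crisis": -3}

def afinn_score(text):
    """Sum lexicon valences over the tokens of a tweet-like text."""
    tokens = re.findall(r"[a-z']+", text.lower())
    return sum(LEXICON.get(tok, 0) for tok in tokens)

print(afinn_score("Happy to feel safe during lockdown"))   # 3 + 1 = 4
print(afinn_score("Sad news, the crisis is bad"))          # -2 - 3 - 3 = -8
```

A corpus is then summarized by the sign and magnitude of the scores, which is how a sample of tweets can be labelled positive overall.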
897 Water Immersion Recovery for Swimmers in Hot Environments
Authors: Thanura Abeywardena
Abstract:
This study examined the effectiveness of cold-water immersion recovery after short-term exhaustive exercise. The purpose was to understand whether 16-20 °C cold-water immersion would be beneficial in a tropical environment for achieving optimal recovery of sprint swim performance, in comparison to 10-15 °C water immersion. Two 100 m sprint swim performance times were measured, along with blood lactate (BLa), heart rate (HR) and rating of perceived exertion (RPE), in a 25 m swimming pool; between the two swims, full-body, head-out horizontal water immersions at 10-15 °C, 16-20 °C or 29-32 °C (pool temperature) were applied for 10 minutes, followed by 5 minutes of seated passive rest outside the pool. Ten well-trained adult swimmers (5 male and 5 female), ranked within the top twenty at the Sri Lankan national swimming championships in the 100 m butterfly and freestyle in 2020 and 2021, volunteered for this study. One-way ANOVA (p < 0.05) suggested that performance time, BLa and HR showed no significant differences between the three conditions after the second sprint; however, RPE differed significantly (p = 0.034) between the 10-15 °C and 16-20 °C immersion conditions. The study suggested that recovery after the two cold-water immersion conditions was similar in terms of performance and physiological factors, but that the 16-20 °C temperature gave a better "feel good" factor after sprint 2. Further study is recommended, as there was participant bias: the swimmers did not reach optimal levels in sprint 1 and might therefore have fully recovered before sprint 2, invalidating the physiological effect of the recovery.
Keywords: Hydrotherapy, blood lactate, fatigue, recovery, sprint-performance, sprint-swimming.
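The one-way ANOVA used to compare the three immersion conditions reduces to an F ratio of between-group to within-group mean squares. A minimal sketch with made-up numbers, not the study's data:

```python
def one_way_anova_f(groups):
    """F statistic for a one-way ANOVA over a list of groups of observations."""
    all_vals = [x for g in groups for x in g]
    n = len(all_vals)
    k = len(groups)
    grand = sum(all_vals) / n
    # Between-group sum of squares: group means vs. the grand mean.
    ss_between = sum(len(g) * ((sum(g) / len(g)) - grand) ** 2 for g in groups)
    # Within-group sum of squares: observations vs. their own group mean.
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    ms_between = ss_between / (k - 1)
    ms_within = ss_within / (n - k)
    return ms_between / ms_within

# Illustrative recovery scores for three immersion conditions:
f = one_way_anova_f([[1.0, 2.0, 3.0], [2.0, 3.0, 4.0], [3.0, 4.0, 5.0]])
print(f)  # 3.0
```

The F value is then compared against the F distribution with (k-1, n-k) degrees of freedom to obtain the p values quoted in the abstract.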
896 A Sensorless Robust Tracking Control of an Implantable Rotary Blood Pump for Heart Failure Patients
Authors: Mohsen A. Bakouri, Andrey V. Savkin, Abdul-Hakeem H. Alomari, Robert F. Salamonsen, Einly Lim, Nigel H. Lovell
Abstract:
Physiological control of a left ventricular assist device (LVAD) is generally a complicated task due to diverse operating environments and patient variability. In this work, a tracking control algorithm based on sliding mode and feed-forward control for a class of discrete-time single-input single-output (SISO) nonlinear uncertain systems is presented. The controller was developed to track a reference trajectory to a set operating point without inducing suction in the ventricle. The controller regulates the estimated mean pulsatile flow Qp and the mean pulsatility index of pump rotational speed PIω generated from a model of the assist device. We recall the principles of sliding mode control theory, then combine the feed-forward control design with the sliding mode control technique to follow the reference trajectory. The uncertainty is replaced by its upper and lower bounds. The controller was tested in a computer simulation covering two scenarios (preload and ventricular contractility). The simulation results prove the effectiveness and robustness of the proposed controller.
Keywords: Robust control system, discrete sliding mode, left ventricular assist device, pulsatility index.
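As a toy illustration of discrete-time sliding mode tracking under bounded uncertainty (a scalar stand-in, not the LVAD model of the paper): the feed-forward part cancels the nominal dynamics, and a switching term sized above the disturbance bound keeps the tracking error inside a narrow band.

```python
import math

# Toy plant: x[k+1] = a*x[k] + b*u[k] + d[k], with |d[k]| <= d_max unknown.
a, b, d_max = 0.9, 1.0, 0.05
eta = 0.06   # switching gain; must exceed d_max for robustness

def smc_step(x, ref):
    """Feed-forward cancels a*x and injects ref; switching term fights d."""
    s = x - ref                                    # sliding variable (error)
    sign = 1.0 if s > 0 else (-1.0 if s < 0 else 0.0)
    return (ref - a * x) / b - (eta / b) * sign

x, ref = 5.0, 1.0
for k in range(50):
    u = smc_step(x, ref)
    d = d_max * math.sin(0.7 * k)                  # bounded disturbance
    x = a * x + b * u + d
```

After one step the error is driven to within eta + d_max of zero and stays there, chattering inside that band, which is the characteristic behavior of a discrete sliding mode.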
895 Study of the Quality of Surface Water in the Upper Cheliff Basin
Authors: Touhari Fadhila, Mehaiguene Madjid, Meddi Mohamed
Abstract:
This work aims to assess the water quality of dams, based on the monitoring of physico-chemical parameters by the National Agency of Water Resources (ANRH) over a period of 10 years (1999-2008). Surface water quality records for the four dams in the upper Cheliff region (Ghrib, Deurdeur, Harreza, and Ouled Mellouk) show a degradation of quality (organic pollution, expressed as COD and OM) over time. Indeed, the recorded COD often exceeds 50 mg/l, and the OM exceeds 15 mg/l. This pollution is caused by discharges of wastewater and by eutrophication. The dam waters show very high salinity (TDS = 2574 mg/l in 2008 for the Ghrib dam, against a standard of 1500 mg/l). The concentration of nitrogenous substances (NH4+, NO2-) in the water was high in 2008 at the Ouled Mellouk dam; this pollution is caused by the oxidation of nitrogenous organic matter. We also studied the relationship between the evolution of the quality parameters and the filling of the dams. We observed a decrease in salinity and COD following an improvement in the filling state of the dams, owing to dilution by incoming rainwater. Levels of nitrates and phosphorus in the waters of the four dams studied were higher during the rainy season than in the dry period; this increase may be due to leaching of fertilizers applied to agricultural soils in the watersheds.
Keywords: Surface water quality, pollution, physical-chemical parameters, upper Cheliff basin.
894 A Structural Constitutive Model for Viscoelastic Rheological Behavior of Human Saphenous Vein Using Experimental Assays
Authors: Rassoli Aisa, Abrishami Movahhed Arezu, Faturaee Nasser, Seddighi Amir Saeed, Shafigh Mohammad
Abstract:
Cardiovascular diseases are one of the most common causes of mortality in developed countries. Coronary artery abnormalities and carotid artery stenosis, also known as silent killers, are among these diseases. One treatment method is to create a deviatory pathway conducting blood to the heart through bypass surgery, and the saphenous vein is usually used in this surgery to create the pathway. Unfortunately, re-operation becomes necessary after some years when the mismatch between the mechanical properties of the graft tissue and/or applied prostheses and those of the host tissue is ignored. The objective of the present study is to clarify the viscoelastic behavior of human saphenous tissue. Stress relaxation tests were performed on this vein in the circumferential and longitudinal directions at 20% and 50% strain. Considering the stress relaxation curves obtained and the coefficients of the standard solid model, it was demonstrated that the saphenous vein has non-linear viscoelastic behavior. Thereafter, fitting with Fung's quasilinear viscoelastic (QLV) model was performed on the stress relaxation-time curves. Finally, the coefficients of Fung's QLV model, which models the behavior of saphenous tissue very well, were presented.
Keywords: Fung’s quasilinear viscoelastic (QLV) model, strain rate, stress relaxation test, uniaxial tensile test, viscoelastic behavior.
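For intuition, the standard (linear) solid model mentioned above predicts a single-exponential stress relaxation at constant strain, sigma(t) = strain * (E_inf + E_1 * exp(-t/tau)); real vein tissue deviates from this form, which is why the quasilinear Fung model is fitted instead. A quick numerical sketch with illustrative moduli (not the fitted coefficients of the paper):

```python
import math

def sls_relaxation(t, strain, e_inf=0.2, e_1=0.8, tau=5.0):
    """Stress at time t for a standard linear solid held at constant strain.

    e_inf: long-term (equilibrium) modulus, e_1: relaxing modulus,
    tau: relaxation time. All values here are illustrative, not fitted.
    """
    return strain * (e_inf + e_1 * math.exp(-t / tau))

# Relaxation curve at 20% strain: stress decays monotonically from
# strain*(e_inf + e_1) toward the equilibrium value strain*e_inf.
curve = [sls_relaxation(t, strain=0.2) for t in range(0, 31, 5)]
```

Comparing a measured relaxation curve against this single-exponential prediction is one quick way to see the non-linear viscoelasticity the abstract reports.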
893 Modelling Hydrological Time Series Using Wakeby Distribution
Authors: Ilaria Lucrezia Amerise
Abstract:
The statistical modelling of precipitation data for a given portion of territory is fundamental for the monitoring of climatic conditions and for hydrogeological management plans (HMP). This modelling is rendered particularly complex by the changes taking place in the frequency and intensity of precipitation, presumably attributable to global climate change. This paper applies the Wakeby distribution (with 5 parameters) as a theoretical reference model. The number and the quality of the parameters indicate that this distribution may be the appropriate choice for interpolating hydrological variables; moreover, the Wakeby distribution is particularly suitable for describing phenomena producing heavy tails. The proposed estimation methods for determining the values of the Wakeby parameters are the same as those used for density functions with heavy tails. The commonly used procedure is the classical method of probability weighted moments (PWM), although this has often shown difficulty of convergence, or rather, convergence to an inappropriate parameter configuration. In this paper, we analyze the problem of likelihood estimation for a random variable expressed through its quantile function. The method of maximum likelihood is, in this case, more demanding than in more usual estimation settings. The justification lies in the sampling and asymptotic properties of maximum likelihood estimators, which improve the estimates by providing indications of their variability and, therefore, of their accuracy and reliability. These features are highly appreciated in contexts where poor decisions, attributable to an inefficient or incomplete information base, can cause serious damage.
Keywords: Generalized extreme values (GEV), likelihood estimation, precipitation data, Wakeby distribution.
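The Wakeby distribution is defined only through its quantile function, x(F) = xi + (alpha/beta)(1 - (1-F)^beta) - (gamma/delta)(1 - (1-F)^(-delta)), which is what makes likelihood estimation awkward but makes inverse-transform sampling trivial. A sketch with illustrative parameter values (not fitted to any precipitation record):

```python
import random

def wakeby_quantile(F, xi=0.0, alpha=3.0, beta=0.5, gamma=0.3, delta=0.2):
    """Wakeby quantile function; delta > 0 gives a heavy upper tail."""
    if not 0 <= F < 1:
        raise ValueError("F must lie in [0, 1)")
    return (xi + (alpha / beta) * (1.0 - (1.0 - F) ** beta)
               - (gamma / delta) * (1.0 - (1.0 - F) ** (-delta)))

def wakeby_sample(n, rng=random, **params):
    """Inverse-transform sampling: push uniforms through the quantile function."""
    return [wakeby_quantile(rng.random(), **params) for _ in range(n)]

xs = wakeby_sample(1000, rng=random.Random(1))
```

Note that as F approaches 1 the term (1-F)^(-delta) diverges, which is exactly the heavy-tail behavior that makes the Wakeby attractive for extreme precipitation.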
892 Granting Saudi Women the Right to Drive in the Eyes of Qatari Media
Authors: Rasha A. Salameh
Abstract:
This research attempts to evaluate the treatment the Qatari media gave to the decision to allow Saudi women to drive, and then to the activation of that decision a few months later, within the time frame from September 26, 2017 to June 30, 2018. It does so by asking several questions, including whether the political dispute between Qatar and Saudi Arabia cast a shadow over this coverage, and whether the Qatari media used it to criticize the Saudi regime for delaying this step. One of the research hypotheses is that the coverage lacked the required professionalism, given that the decision and its activation took place during the political stalemate between Qatar and the Kingdom of Saudi Arabia; testing this requires applying framing and agenda-setting theories to see to what extent they fit this case. The research used a sample of five Qatari outlets: Al-Jazeera Net, The New Arab newspaper, Al-Sharq newspaper, The Arab newspaper, and Al-Watan newspaper. The results showed that most of the authors who covered the decision to allow Saudi women to drive did not achieve balance in their writing, and that almost half of them lacked objectivity. This supports the hypothesis of a defect in professional competence in the Qatari media's coverage of the decision. The researcher attributes this result to the political standoff between Qatar and Saudi Arabia, and to the fact that most Arab media operate under a low ceiling of freedom and largely align their positions with the official view of their regimes.
Keywords: Saudi women, stereotypes, hate speech, framing.
891 Integrating Fast Karnough Map and Modular Neural Networks for Simplification and Realization of Complex Boolean Functions
Authors: Hazem M. El-Bakry
Abstract:
In this paper, a new fast simplification method is presented. The method realizes Karnaugh maps with a large number of variables. In order to accelerate the operation of the proposed method, a new approach for fast detection of groups of ones is presented. This approach is implemented in the frequency domain: the search operation relies on performing cross-correlation in the frequency domain rather than in the time domain. It is proved mathematically and practically that the number of computation steps required by the presented method is less than that needed by conventional cross-correlation. Simulation results using MATLAB confirm the theoretical computations. Furthermore, a powerful solution for the realization of complex functions is given. The simplified functions are implemented using a new design for neural networks. Neural networks are used because they are fault tolerant and can therefore recognize signals even with noise or distortion, which is very useful for logic functions used in data and computer communications. Moreover, the implemented functions are realized with a minimum amount of components, by using modular neural nets (MNNs) that divide the input space into several homogeneous regions. The approach is applied to implement the XOR function, 16 logic functions at the one-bit level, and a 2-bit digital multiplier. Compared to previous non-modular designs, a clear reduction in the order of computations and hardware requirements is achieved.
Keywords: Boolean functions, simplification, Karnaugh map, implementation of logic functions, modular neural networks.
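The frequency-domain search rests on the correlation theorem: circular cross-correlation can be computed as an inverse FFT of a product of spectra, which is asymptotically cheaper than sliding a template directly. A generic sketch of the idea (not the paper's detector):

```python
import numpy as np

def circular_cross_correlation(a, b):
    """c[m] = sum_n a[n] * b[(n - m) mod N], via the correlation theorem."""
    A = np.fft.fft(a)
    B = np.fft.fft(b)
    return np.real(np.fft.ifft(A * np.conj(B)))

# Locate a short run of ones inside a binary map row:
row = np.array([0, 0, 1, 1, 1, 0, 0, 0], dtype=float)
template = np.array([1, 1, 1, 0, 0, 0, 0, 0], dtype=float)  # zero-padded
scores = circular_cross_correlation(row, template)
shift = int(np.argmax(scores))   # the template aligns best at shift 2
```

The direct computation costs O(N^2) multiplications per search, while the FFT route costs O(N log N), which is the source of the speed-up claimed in the abstract.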
890 A Proposed Hybrid Color Image Compression Based on Fractal Coding with Quadtree and Discrete Cosine Transform
Authors: Shimal Das, Dibyendu Ghoshal
Abstract:
Fractal-based digital image compression is a specific technique in the field of color imaging. The method is best suited to images with irregular shapes, like snow, clouds, flames, or tree leaves, relying on the fact that parts of an image often resemble other parts of the same image. The technique has drawn much attention in recent years because of the very high compression ratio that can be achieved. Hybrid schemes incorporating fractal compression and speed-up techniques have achieved higher compression ratios than pure fractal compression. Fractal image compression is a lossy compression method in which the self-similar nature of an image is used. The technique provides a high compression ratio, shorter encoding time, and a fast decoding process. In this paper, fractal compression with a quadtree and the DCT is proposed to compress color images. The proposed hybrid scheme requires four phases to compress a color image. First, the image is segmented and the Discrete Cosine Transform is applied to each block of the segmented image. Second, the block values are scanned in a zigzag manner so that the zero coefficients are grouped together. Third, the resulting image is partitioned into fractals by the quadtree approach. Fourth, the image is compressed using the run-length encoding technique.
Keywords: Fractal coding, Discrete Cosine Transform, Iterated Function System (IFS), Affine Transformation, Run length encoding.
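The zigzag scan in phase two orders DCT coefficients diagonal by diagonal, so the high-frequency (mostly zero) coefficients end up in one long run at the tail, which run-length encoding then compresses well. A minimal sketch for an n×n block (generic, not the paper's exact pipeline):

```python
def zigzag_order(n):
    """Visit (row, col) indices of an n x n block in JPEG-style zigzag order."""
    order = []
    for s in range(2 * n - 1):                 # s = row + col (anti-diagonal)
        diag = [(i, s - i) for i in range(n) if 0 <= s - i < n]
        if s % 2 == 0:
            diag.reverse()                     # alternate the diagonal direction
        order.extend(diag)
    return order

def zigzag_scan(block):
    """Flatten a square block of coefficients in zigzag order."""
    n = len(block)
    return [block[i][j] for i, j in zigzag_order(n)]

# 4x4 block whose entries are their row-major positions 0..15:
block = [[4 * r + c for c in range(4)] for r in range(4)]
print(zigzag_scan(block))
# [0, 1, 4, 8, 5, 2, 3, 6, 9, 12, 13, 10, 7, 11, 14, 15]
```

For an 8×8 DCT block the same function reproduces the familiar JPEG scan order.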
889 Seamless Flow of Voluminous Data in High Speed Network without Congestion Using Feedback Mechanism
Abstract:
Continuously growing needs of Internet applications that transmit massive amounts of data have led to the emergence of high speed networks. Data transfer must take place without congestion, and hence feedback parameters must be sent from the receiver to the sender so as to restrict the sending rate and avoid congestion. Even though TCP tries to avoid congestion by restricting the sending rate and window size, it never informs the sender of the capacity of data that can be sent, and it halves the window size at the time of congestion, resulting in decreased throughput, low utilization of the bandwidth and maximum delay. In this paper, the XCP protocol is used, and feedback parameters are calculated based on the arrival rate, service rate, traffic rate and queue size; the receiver thus informs the sender about the throughput, the capacity of data that can be sent, and the window size adjustment. This results in no drastic decrease in window size and a better increase in sending rate, so that there is a continuous flow of data without congestion and, as a result, a maximum increase in throughput, high utilization of the bandwidth and minimum delay. The results of the proposed work are presented as graphs of throughput, delay and window size. Thus, in this paper, the XCP protocol is illustrated and the various parameters are thoroughly analyzed and presented.
Keywords: Bandwidth-delay product, congestion control, congestion window, TCP/IP.
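In XCP, routers compute an aggregate feedback per control interval from the spare bandwidth and the persistent queue, then apportion it across packets so that senders adjust their windows explicitly instead of halving on loss. A simplified sketch of the aggregate feedback computation; the gains alpha = 0.4 and beta = 0.226 are the values usually quoted for XCP, and the rest is an illustrative simplification, not this paper's exact parameter set.

```python
ALPHA, BETA = 0.4, 0.226   # stability gains commonly quoted for XCP

def aggregate_feedback(capacity, input_rate, persistent_queue, mean_rtt):
    """Desired change in aggregate traffic (bytes) over one control interval.

    Positive when there is spare bandwidth to hand out; negative when the
    link is overdriven or a standing queue must be drained.
    """
    spare = capacity - input_rate            # bytes/s of unused capacity
    return ALPHA * mean_rtt * spare - BETA * persistent_queue

# Underloaded link: senders are told to speed up.
up = aggregate_feedback(capacity=1.25e8, input_rate=1.0e8,
                        persistent_queue=0.0, mean_rtt=0.08)
# Standing queue with no spare capacity: senders are told to slow down.
down = aggregate_feedback(capacity=1.25e8, input_rate=1.25e8,
                          persistent_queue=5.0e5, mean_rtt=0.08)
```

Because the feedback is proportional rather than multiplicative, the window shrinks gradually instead of being cut in half, which is the behavior the abstract contrasts with TCP.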
888 A Detailed Experimental Study of the Springback Anisotropy of Three Metals using the Stretching-Bending Process
Authors: A. Soualem
Abstract:
Springback is a significant problem in the sheet metal forming process. When the tools are released after the forming stage, the product springs back because of the action of internal stresses. In many cases the deviation of form is too large, and compensation of the springback is necessary. Precise prediction of the springback of a product is increasingly important for the design of the tools, and for compensation, because of the higher ratio of yield stress to elastic modulus in modern sheet materials. The main object of this paper was to study the effect of anisotropy on the springback for three rolling directions: 0°, 45° and 90°. At the same time, we highlighted the influence of three different metallic materials: aluminum, steel and galvanized steel. The originality of our approach consists in tests performed by adapting a U-type stretching-bending device to a tensile testing machine, with which we studied and quantified the variation of the springback according to the rolling direction. We also showed the role of lubrication in reducing springback. Moreover, we studied important characteristics of the deep drawing process related to springback, presented the defects that appear in this process, and discussed the many parameters that influence springback. Finally, our results lead us to understand the influence of grain orientation for different metallic materials on springback, and to draw conclusions on how to design deep drawing tools. The work conducted thus represents a fundamental contribution to the discussion of industrial applications.
Keywords: Deep drawing, grain orientation, laminate tool, springback.
887 Face Recognition Using Principal Component Analysis, K-Means Clustering, and Convolutional Neural Network
Authors: Zukisa Nante, Wang Zenghui
Abstract:
Face recognition is the problem of identifying or verifying individuals in an image. This paper investigates a possible method to bring a solution to this problem. The method proposes an amalgamation of Principal Component Analysis (PCA), K-Means clustering, and a Convolutional Neural Network (CNN) for a face recognition system. It is trained and evaluated using the ORL dataset, which consists of 400 face images in 40 classes, with 10 images per class. Firstly, PCA enables the use of a smaller network, which reduces the training time of the CNN; we get rid of redundancy and preserve the variance with a smaller number of coefficients. Secondly, the K-Means clustering model is trained using the PCA-compressed data, which selects K-Means cluster centers with better characteristics. Lastly, the K-Means features serve as initial values and input data for the CNN. The accuracy and performance of the proposed method were tested in comparison to other face recognition (FR) techniques, namely PCA, Support Vector Machine (SVM), and k-Nearest Neighbour (kNN). During experimentation, our suggested method achieved the highest performance after 90 epochs: 99% accuracy, 99% F1-score, 99% precision, and 99% recall in 463.934 seconds. It outperformed PCA, which obtained 97%, and kNN, with 84%, in the conducted experiments. Therefore, this method proved efficient in identifying faces in images.
Keywords: Face recognition, Principal Component Analysis, PCA, Convolutional Neural Network, CNN, Rectified Linear Unit, ReLU, feature extraction.
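The PCA followed by K-Means front end of the pipeline can be sketched with plain NumPy. This is a toy stand-in for the ORL experiments: the dimensions, cluster count and synthetic "faces" are illustrative, and the CNN stage is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

def pca_fit_transform(X, n_components):
    """Project rows of X onto the top principal components via SVD."""
    mean = X.mean(axis=0)
    U, S, Vt = np.linalg.svd(X - mean, full_matrices=False)
    return (X - mean) @ Vt[:n_components].T, Vt[:n_components], mean

def kmeans(X, k, n_iter=50):
    """Plain Lloyd's algorithm; returns centers and hard assignments."""
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iter):
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return centers, labels

# Toy "faces": 40 samples of dimension 100, drawn around 4 prototypes.
protos = rng.normal(size=(4, 100)) * 5.0
X = np.vstack([p + rng.normal(size=(10, 100)) for p in protos])
Z, components, mean = pca_fit_transform(X, n_components=10)
centers, labels = kmeans(Z, k=4)
```

In the paper's pipeline, the cluster centers and the compressed features feed the CNN; here the sketch stops at the clustering step.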
886 Response Delay Model: Bridging the Gap in Urban Fire Disaster Response System
Authors: Sulaiman Yunus
Abstract:
The need for modeling the response to urban fire disasters cannot be overemphasized, as recurrent fire outbreaks have gutted most cities of the world. This necessitates a prompt and efficient response system in order to mitigate the impact of the disaster. Promptness, as a function of time, is the fundamental determinant of the efficiency of a response system and of the magnitude of a fire disaster. Delay, arising from several factors, is one of the major determinants of the promptness of a response system and, likewise, of the magnitude of a fire disaster. The Response Delay Model (RDM) intends to bridge the gap in urban fire disaster response systems by incorporating and synchronizing the delay moments when measuring the overall efficiency of a response system and determining the magnitude of a fire disaster. The model identifies two delay moments (pre-notification delay and intra-reflex sequence delay) that can be elastic and that collectively play a significant role in the efficiency of a response system. Because the elasticity of the delay moments varies, the model provides for measuring the length of delays in order to arrive at a standard average delay moment for different parts of the world, taking into consideration geographic location, level of preparedness and awareness, technological advancement, and socio-economic and environmental factors. It is recommended that participatory research be embarked on, locally and globally, to determine standard average delay moments within each phase of the system, so as to enable determining the efficiency of response systems and predicting fire disaster magnitudes.
Keywords: Delay moment, fire disaster, reflex sequence, response, response delay moment.
PDF Downloads 734
885 The Mediating Effect of MSMEs Export Performance between Technological Advancement Capabilities and Business Performance
Authors: Fawad Hussain, Mohammad Basir Bin Saud, Mohd Azwardi Md Isa
Abstract:
The aim of this study is to empirically investigate the mediating impact of export performance (EP) between the technological advancement capabilities and business performance (BP) of Malaysian manufacturing micro, small and medium-sized enterprises (MSMEs). A firm's technological advancement resources are hypothesized as a platform to enhance both the exports and BP of manufacturing MSMEs in Malaysia. The study is twofold: first, it investigates whether technological advancement capabilities help to improve the main performance measure, EP; second, it investigates how efficiently and effectively technological advancement capabilities contribute to overall Malaysian MSME BP. SmartPLS 3 statistical software was used to examine the association between technological advancement capabilities, MSME EP, and BP. The data were collected from Malaysian manufacturing MSMEs in the east coast industrial zones, known as the manufacturing hub of MSMEs. Seven hundred and fifty (750) questionnaires were distributed, but only 148 usable questionnaires were returned. The findings indicate that technological advancement capabilities help to strengthen exports in terms of time and cost efficiency and play a significant role in improving BP. This study is helpful for small and medium enterprise owners who intend to expand their business overseas and who, through smart technological advancement resources, can achieve business competitiveness and excellence in both local and international markets.
Keywords: Technological advancement capabilities, export performance, business performance, small and medium manufacturing enterprises, Malaysia.
PDF Downloads 1769
884 Recycling for Sustainability: Plant Growth Media from Coal Combustion Products, Biosolids and Compost
Authors: Sougata Bardhan, Yona Chen, Warren A. Dick
Abstract:
Generation of electricity from coal has increased over the years in the United States and around the world. Burning of coal results in the annual production of upwards of 100 million tons (United States only) of coal combustion products (CCPs). Only about a third of these products are used to create new products, while the remainder goes to landfills. Application of CCPs mixed with composted organic materials onto soil can improve the soil's physico-chemical condition and provide essential plant nutrients. Our objective was to create plant growth media utilizing CCPs and compost in a way that maximizes the use of these products while, at the same time, maintaining good plant growth. Media were formulated by adding composted organic matter (COM) to CCPs at ratios ranging from 2:8 to 8:2 (v/v). The quality of these media was evaluated by 1) measuring their physical and chemical properties and 2) measuring the growth of three plant species in the experimental media: wheat (Triticum sativum), tomato (Lycopersicum esculentum) and marigold (Tagetes patula). We achieved significantly (p < 0.001) higher growth (7-130%) in the experimental media containing CCPs compared to a commercial mix. The experimental media supplied adequate plant nutrition, as no fertilization was provided during the experiment. Based on the results, we recommend the use of CCPs and composts for the creation of plant growth media.
Keywords: Coal ash, FGD gypsum, organic compost, plant growth media.
PDF Downloads 1948
883 Electoral Mathematics and Asymmetrical Treatment to Political Parties: The Mexican Case
Authors: Verónica Arredondo, Miguel Martínez-Panero, Teresa Peña, Victoriano Ramírez
Abstract:
The Mexican Chamber of Deputies is composed of 500 representatives: 300 elected by relative majority and another 200 elected through proportional representation in five electoral clusters (constituencies) of 40 representatives each. In this mixed-member electoral system, the seat distribution under proportional representation is not independent of the election by relative majority, as it attempts to correct representation imbalances produced in single-member districts. This two-fold structure has been maintained through the successive electoral reforms carried out over the last three decades (eight from 1986 to 2014). In all of them, the election process for the 200 seats is complex: the formulas in the law are difficult to understand and to interpret. This paper analyzes the Mexican electoral system after the electoral reform of 2014, which was applied for the first time in 2015. The research focuses on contradictions and issues of applicability, in particular situations where seat allocation is affected by ambiguity in the law and where asymmetrical treatment of political parties arises. In light of these facts, a proposal for electoral reform is presented. It is intended to be simpler, clearer, and more enduring than the current system. Furthermore, this model is more suitable for producing electoral outcomes free of contradictions and paradoxes. This approach would allow fair treatment of political parties and, as a result, an improved opportunity to exercise democracy.
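For readers unfamiliar with proportional seat allocation, a small sketch helps fix ideas. The following is the D'Hondt divisor method on hypothetical vote counts; it is offered only as an illustration of how an apportionment rule distributes seats, not as the formula in the Mexican electoral law, which the abstract argues is considerably more intricate:

```python
def dhondt(votes, seats):
    """Allocate `seats` among parties by the D'Hondt divisor method:
    each seat goes to the party with the largest quotient v / (s + 1),
    where s is the number of seats the party already holds."""
    alloc = {p: 0 for p in votes}
    for _ in range(seats):
        winner = max(votes, key=lambda p: votes[p] / (alloc[p] + 1))
        alloc[winner] += 1
    return alloc

# Illustrative vote counts for three hypothetical parties.
result = dhondt({"A": 340_000, "B": 280_000, "C": 160_000}, 8)
# → A gets 4 seats, B gets 3, C gets 1
```

Ambiguities of the kind the paper studies arise when such formulas interact with majority-seat corrections and per-party caps, which a plain divisor rule does not capture.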
Keywords: Apportionment paradoxes, biproportional representation, electoral mathematics, electoral reform, Mexican electoral system, proportional representation, political asymmetry.
PDF Downloads 1227
882 Wildfires Assessed by Remote Sense Images and Burned Land Monitoring
Authors: M. C. Proença
Abstract:
The tools described in this paper enable the location of burned areas where natural habitats were annihilated, and establish a baseline for major changes in forest ecosystems during recovery. Moreover, the result allows follow-up of the surface fuel loading, allowing the evaluation and guidance of restoration measures in remote areas by phased time planning. This case study evaluates burned areas that suffered successive wildfires in mainland Portugal during the summer of 2017, which killed more than 60 people. The goal is to show that this evaluation can be done with free-of-charge remote sensing data on a simple laptop, with open-source software, describing the not-so-simple methodology step by step to make it accessible to local workers in the affected areas, where the availability of information is essential for the immediate planning of mitigation measures, such as restoring road access, allocating funds for the recovery of human dwellings, and assessing further needs for restoration of the ecological system. Wildfires also devastate forest ecosystems, having a direct impact on vegetation cover and killing or driving away the animal population, besides the loss of all crops in rural areas that are essential as local resources. Economic interests are also affected, as burned pinewood becomes useless for the noblest applications, so its value decreases, and resin extraction ends for several years.
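A common open-source approach to mapping burned areas from Sentinel-2 imagery (the abstract does not name its exact index, so this is an assumption) is the differenced Normalized Burn Ratio, computed from the near-infrared and shortwave-infrared bands. A minimal numpy sketch on toy pixel values:

```python
import numpy as np

def nbr(nir, swir):
    """Normalized Burn Ratio: (NIR - SWIR) / (NIR + SWIR).
    For Sentinel-2 these are typically bands B8 and B12."""
    nir = nir.astype(float)
    swir = swir.astype(float)
    return (nir - swir) / (nir + swir + 1e-10)  # epsilon avoids 0/0

# dNBR = pre-fire NBR minus post-fire NBR; high values flag burned land.
pre_nir, pre_swir = np.array([[0.5]]), np.array([[0.2]])
post_nir, post_swir = np.array([[0.2]]), np.array([[0.4]])
dnbr = nbr(pre_nir, pre_swir) - nbr(post_nir, post_swir)
burned = dnbr > 0.27  # 0.27 is a commonly used severity threshold
```

Applied band-by-band over a whole scene, this yields the burned-area mask that underpins the kind of monitoring the paper describes.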
Keywords: Image processing, remote sensing, wildfires, burned areas, SENTINEL-2.
PDF Downloads 1590
881 Routing Medical Images with Tabu Search and Simulated Annealing: A Study on Quality of Service
Authors: Mejía M. Paula, Ramírez L. Leonardo, Puerta A. Gabriel
Abstract:
In telemedicine, the image repository service is important for increasing the accuracy of diagnostic support for medical personnel. This study compares two routing algorithms with regard to quality of service (QoS), in order to analyze the optimal performance when loading and/or downloading medical images. The study focused on comparing the performance of Tabu Search with other heuristic and metaheuristic algorithms that improve QoS in telemedicine services in Colombia. For this purpose, the Tabu Search and Simulated Annealing algorithms were chosen for their high usability in this type of application; QoS was measured using the following metrics: delay, throughput, jitter and latency. In addition, routing tests were carried out on ten 40 MB images in Digital Imaging and Communications in Medicine (DICOM) format. These tests were carried out for ten minutes under different traffic conditions, reaching a total of 25 tests, from a server at Universidad Militar Nueva Granada (UMNG) in Bogotá, Colombia to a remote user at Universidad de Santiago de Chile (USACH), Chile. The results show that Tabu Search presents better QoS performance than Simulated Annealing, managing to optimize the routing of medical images, a basic requirement for offering diagnostic image services in telemedicine.
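To make the comparison concrete, here is a toy Simulated Annealing sketch that picks the candidate route with the lowest delay score. The route set, delay values, and cooling schedule are all illustrative assumptions, not the study's actual network model:

```python
import math
import random

def simulated_annealing(costs, steps=500, t0=10.0, seed=1):
    """Pick the index of the minimum-cost route (e.g. measured delay)
    by accepting worse moves with probability exp(-delta / t)."""
    rng = random.Random(seed)
    current = rng.randrange(len(costs))
    best = current
    for k in range(steps):
        t = t0 / (1 + k)                  # simple cooling schedule
        cand = rng.randrange(len(costs))  # neighbour: any other route
        delta = costs[cand] - costs[current]
        if delta < 0 or rng.random() < math.exp(-delta / t):
            current = cand                # accept the move
        if costs[current] < costs[best]:
            best = current                # track the best route seen
    return best

# Hypothetical per-route delay scores (ms) for five candidate routes.
delays = [120.0, 95.0, 87.5, 140.0, 101.0]
best_route = simulated_annealing(delays)
```

Tabu Search differs by keeping a short-term memory (the tabu list) of recent moves to avoid cycling, which is one reason it can explore the route space more systematically.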
Keywords: Medical image, QoS, simulated annealing, Tabu search, telemedicine.
PDF Downloads 960
880 Investigations on the Influence of Optimized Charge Air Cooling for a Diesel Passenger Car
Authors: Christian Doppler, Gernot Hirschl, Gerhard Zsiga
Abstract:
Starting in 2020, an EU-wide CO2 limit of 95 g/km is scheduled for the average of an OEM's passenger car fleet. Taking that into consideration, additional improvement measures for the Diesel cycle are necessary in order to reduce fuel consumption and emissions while boosting, or at least keeping, performance values at the same time. The present article deals with the possibilities of an optimized air/water charge air cooler, also called iCAC (indirect Charge Air Cooler), for a Diesel passenger car under extreme boundary conditions. In this context, the precise objective was to show the impact of improved intercooling on the engine working process (fuel consumption and NOx emissions). Several extreme boundary conditions - e.g. varying ambient temperatures or mountainous routes - that will become very important in the near future regarding RDE (Real Driving Emissions) were the subject of the investigation. With the introduction of RDE in 2017 (EU6c measure), the controversial NEDC (New European Driving Cycle) will belong to the past, and OEMs will have to avoid harmful emissions in any conceivable real-life situation. This is certainly going to lead to optimization measures in the powertrain, which in turn is going to make the implementation of iCACs, presently used solely in the premium class, more and more attractive for compact-class cars. The investigations showed a fuel consumption benefit of between 1 and 3% for the iCAC under real-world conditions.
Keywords: Air/Water-Charge Air Cooler, Co-Simulation, Diesel Working Process, EURO VI Fuel Consumption.
PDF Downloads 2905
879 Controlling Water Temperature during the Electrocoagulation Process Using an Innovative Flow Column-Electrocoagulation Reactor
Authors: Khalid S. Hashim, Andy Shaw, Rafid Alkhaddar, Montserrat Ortoneda Pedrola
Abstract:
A flow column has been innovatively used in the design of a new electrocoagulation reactor (ECR1) that will reduce the temperature of the water being treated; the flow columns work as a radiator for the water being treated. In order to investigate the performance of ECR1 and compare it to that of traditional reactors, 600 mL water samples with an initial temperature of 35 °C were pumped continuously through these reactors for 30 min at a current density of 1 mA/cm2. The temperature of the water being treated was measured at 5-minute intervals over a 30-minute period using a thermometer. Additional experiments were carried out to investigate the effects of initial temperature (15-35 °C), water conductivity (0.15-1.2 S) and current density (0.5-3 mA/cm2) on the performance of ECR1. The results obtained demonstrated that ECR1, at a current density of 1 mA/cm2 in the continuous flow model, reduced the water temperature from 35 °C to the vicinity of 28 °C during the first 15 minutes and kept it at the same level until the end of the treatment time. Meanwhile, the temperature increased from 28.1 to 29.8 °C and from 29.8 to 31.9 °C in the batch and the traditional continuous flow models, respectively. In terms of initial temperature, ECR1 maintained the temperature of the water being treated within the range of 22 to 28 °C without the need for an external cooling system, even when the initial temperatures varied over a wide range (15 to 35 °C). The influent water conductivity was found to be a significant variable affecting the temperature; the desirable value of water conductivity is 0.6 S. However, it was found that the water temperature increased rapidly at higher current densities.
Keywords: Water temperature, flow column, electrocoagulation.
PDF Downloads 2352
878 Location Update Cost Analysis of Mobile IPv6 Protocols
Authors: Brahmjit Singh
Abstract:
Mobile IP has been developed to provide continuous information network access to mobile users. In IP-based mobile networks, location management is an important component of mobility management. It enables the system to track the location of a mobile node between consecutive communications, and includes two important tasks: location update and call delivery. Location update is associated with signaling load; frequent updates lead to degradation in the overall performance of the network and underutilization of resources. It is, therefore, necessary to devise mechanisms to minimize the update rate. Mobile IPv6 (MIPv6) and Hierarchical MIPv6 (HMIPv6) have been the potential candidates for deployment in mobile IP networks for mobility management. Studies have shown HMIPv6 to perform better than MIPv6: it reduces the signaling overhead traffic by making the registration process local. In this paper, we present a performance analysis of MIPv6 and HMIPv6 using an analytical model. A location update cost function is formulated based on the fluid flow mobility model. The impact of cell residence time, cell residence probability and the user's mobility is investigated. Numerical results are obtained and presented in graphical form. It is shown that HMIPv6 outperforms MIPv6 for high-mobility users only; for low-mobility users, the performance of the two schemes is almost equivalent.
Keywords: Wireless networks, Mobile IP networks, mobility management, performance analysis, handover.
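The fluid flow mobility model mentioned above gives the mean rate at which a user crosses cell boundaries, and the location update cost scales with that rate. A small sketch of this relationship (the cell geometry, speed, and per-update cost are illustrative values, not the paper's parameters):

```python
import math

def crossing_rate(v, perimeter, area):
    """Fluid-flow mobility model: mean cell-boundary crossings per
    unit time for a user at speed v in a cell with the given
    perimeter L and area S, i.e. rate = v * L / (pi * S)."""
    return v * perimeter / (math.pi * area)

def update_cost_rate(v, perimeter, area, cost_per_update):
    """Signaling cost rate = crossing rate x cost of one update."""
    return crossing_rate(v, perimeter, area) * cost_per_update

# Hypothetical circular cell of radius 1 km, user moving at 5 km/h.
r = 1.0
c = update_cost_rate(5.0, 2 * math.pi * r, math.pi * r**2,
                     cost_per_update=3.0)
```

The rate grows linearly with speed, which is why schemes like HMIPv6 that localize registration help mainly for high-mobility users.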
PDF Downloads 1756
877 Unstructured-Data Content Search Based on Optimized EEG Signal Processing and Multi-Objective Feature Extraction
Authors: Qais M. Yousef, Yasmeen A. Alshaer
Abstract:
Over the last few years, the amount of data available around the globe has increased rapidly. This has coincided with the emergence of recent concepts such as big data and the Internet of Things, which have furnished a suitable solution for the availability of data all over the world. However, managing this massive amount of data remains a challenge due to the large variety of types and distribution. Consequently, locating a required file, particularly on the first attempt, has become difficult, due to the large similarity of names among different files distributed on the web, and the accuracy and speed of search have been negatively affected. This work presents a method that uses electroencephalography (EEG) signals to locate files based on their contents. Building on the concept of natural mind-wave processing, this work records the mind-wave signals of different people, analyzes them, extracts their most appropriate features using a multi-objective metaheuristic algorithm, and then classifies them using an artificial neural network to distinguish among files with similar names. The aim of this work is to provide the ability to find files based on their contents using human thoughts only. Implementing this approach and testing it on real people proved its ability to find the desired files accurately within a noticeably shorter time and retrieve them as a first choice for the user.
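The feature-selection-then-classification stage described above can be sketched with scikit-learn. Note the simplifications: synthetic vectors stand in for real EEG features, and a univariate F-score filter stands in for the multi-objective metaheuristic selector, which the abstract does not specify further:

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-in for EEG feature vectors, labelled by target file.
X, y = make_classification(n_samples=400, n_features=60,
                           n_informative=12, n_classes=4,
                           random_state=0)

# Simplified stand-in for the multi-objective metaheuristic step:
# keep the 20 most discriminative features by ANOVA F-score.
X_sel = SelectKBest(f_classif, k=20).fit_transform(X, y)

# Classify the reduced features with a small neural network.
X_tr, X_te, y_tr, y_te = train_test_split(X_sel, y, random_state=0)
ann = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000,
                    random_state=0).fit(X_tr, y_tr)
acc = ann.score(X_te, y_te)
```

A genuine multi-objective selector would trade off, e.g., classification accuracy against feature-set size rather than score each feature independently.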
Keywords: Artificial intelligence, data contents search, human active memory, mind wave, multi-objective optimization.
PDF Downloads 922
876 A Psychophysiological Evaluation of an Effective Recognition Technique Using Interactive Dynamic Virtual Environments
Authors: Mohammadhossein Moghimi, Robert Stone, Pia Rotshtein
Abstract:
Recording psychological and physiological correlates of human performance within virtual environments, and interpreting their impact on human engagement, ‘immersion’ and related emotional or ‘affective’ states, is both academically and technologically challenging. By exposing participants to an effective, real-time (game-like) virtual environment, designed and evaluated in an earlier study, a psychophysiological database containing the EEG, GSR and heart rate of 30 male and female gamers, each exposed to 10 games, was constructed. Some 174 features were subsequently identified and extracted from a number of windows with 28 different timing lengths (e.g. 2, 3, 5, etc. seconds). After reducing the number of features to 30 using a feature selection technique, K-Nearest Neighbour (KNN) and Support Vector Machine (SVM) methods were employed for the classification process. The classifiers categorised the psychophysiological database into four affective clusters (defined in a 3-dimensional space of valence, arousal and dominance) and eight emotion labels (relaxed, content, happy, excited, angry, afraid, sad, and bored). The KNN and SVM classifiers achieved average cross-validation accuracies of 97.01% (±1.3%) and 92.84% (±3.67%), respectively. However, no significant differences were found between classification based on affective clusters and classification based on emotion labels.
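The cross-validated KNN/SVM comparison reported above follows a standard pattern, sketched here with scikit-learn on synthetic stand-in data (30 features mirroring the reduced feature set, four classes mirroring the affective clusters; all other settings are assumptions):

```python
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.datasets import make_classification

# Synthetic stand-in for the 30-feature psychophysiological vectors,
# labelled with four affective clusters.
X, y = make_classification(n_samples=300, n_features=30,
                           n_informative=10, n_classes=4,
                           random_state=0)

# 5-fold cross-validated accuracy for each classifier.
knn_acc = cross_val_score(KNeighborsClassifier(5), X, y, cv=5).mean()
svm_acc = cross_val_score(SVC(kernel="rbf"), X, y, cv=5).mean()
```

Averaging accuracy over folds, as done here, is what yields the ±-style variability figures the abstract quotes.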
Keywords: Virtual reality, affective computing, affective VR, emotion-based affective physiological database.
PDF Downloads 995
875 A Novel SVM-Based OOK Detector in Low SNR Infrared Channels
Authors: J. P. Dubois, O. M. Abdul-Latif
Abstract:
The Support Vector Machine (SVM) is a recent class of statistical classification and regression techniques playing an increasing role in detection problems across various engineering fields, notably statistical signal processing, pattern recognition, image analysis, and communication systems. In this paper, SVM is applied to an infrared (IR) binary communication system with different channel models, including Ricean multipath fading and a partially developed scattering channel, with additive white Gaussian noise (AWGN) at the receiver. The structure and performance of the SVM, in terms of the bit error rate (BER) metric, are derived and simulated for these stochastic channel models, and the computational complexity of the implementation, in terms of average computational time per bit, is also presented. The performance of the SVM is then compared to classical binary maximum likelihood detection using a matched filter driven by on-off keying (OOK) modulation. We found that the performance of the SVM is superior to that of the traditional optimal detection schemes used in statistical communication, especially in very low signal-to-noise ratio (SNR) ranges. For large SNR, the performance of the SVM is similar to that of the classical detectors. The implication of these results is that SVM can prove very beneficial to IR communication systems, which notoriously suffer from low SNR, at the cost of increased computational complexity.
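The SVM-versus-matched-filter comparison can be sketched on a plain AWGN channel (the paper's Ricean fading and scattering channels are omitted here; the pulse shape, noise level, and sample counts are illustrative assumptions):

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n, L = 2000, 8  # number of bits, samples per bit
bits = rng.integers(0, 2, n)

# OOK over AWGN: a pulse of amplitude 1 for "1", nothing for "0",
# heavily corrupted by noise (low SNR).
signal = np.repeat(bits, L).reshape(n, L).astype(float)
rx = signal + rng.normal(0, 1.5, (n, L))

# SVM detector: train on half the received bits, test on the rest.
svm = SVC(kernel="rbf").fit(rx[:1000], bits[:1000])
svm_ber = np.mean(svm.predict(rx[1000:]) != bits[1000:])

# Classical matched-filter (correlation) detector with mid threshold.
mf = rx[1000:].sum(axis=1)
mf_ber = np.mean((mf > L / 2).astype(int) != bits[1000:])
```

On pure AWGN the matched filter is already optimal, so the two BERs come out similar; the paper's claimed SVM advantage arises on the fading and scattering channels, where the matched filter's assumptions break down.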
Keywords: Least square-support vector machine, on-off keying, matched filter, maximum likelihood detector, wireless infrared communication.
PDF Downloads 1955
874 Linguistic Devices Reflecting Violence in Border-Provinces of Southern Thailand on the Front Page of Local and National Newspapers
Authors: Chanokporn Angsuviriya
Abstract:
The objective of this study is to analyze the linguistic devices reflecting violence in the southern border provinces, namely Pattani, Yala, Narathiwat and Songkla, on 1,344 front pages of three local newspapers (ChaoTai, Focus PhakTai and Samila Time) and two national newspapers (ThaiRath and Matichon), between 2004 and 2005 and between 2011 and 2012. The study shows that there are two important linguistic devices: 1) lexical choices, consisting of verbs describing violence, quantitative words, and words naming those who committed violent acts; and 2) metaphors, consisting of “A VIOLENT PROBLEM IS HEAT”, “A VICTIM IS A LEAF”, and “A TERRORIST IS A DOG”. Comparing the two types of newspapers, national newspapers choose more violent words than local newspapers do. Moreover, they create more negative images of the south of Thailand by using stative verbs. In addition, in terms of metaphors, “A TERRORIST IS A FOX” is found only in national newspapers. As regards naming terrorists, the noun phrase “southern insurgents”, used collectively by national newspapers, carries a strongly negative meaning and has been perceived by Thais across the whole country, while the unmodified “insurgents” has been used only by local newspapers.
Keywords: Linguistic Devices, Local Newspapers, National Newspapers, Violence.
PDF Downloads 1289