Search results for: Abdallah Alakayleh
37 An Event Based Approach to Extract the Run Time Execution Path of BPEL Process for Monitoring QoS in the Cloud
Authors: Rima Grati, Khouloud Boukadi, Hanene Ben-Abdallah
Abstract:
Due to the dynamic nature of the Cloud, continuous monitoring of QoS requirements is necessary to manage the Cloud computing environment. The process of QoS monitoring and SLA violation detection consists of: collecting low and high level information pertinent to the service, analyzing the collected information, and taking corrective actions when SLA violations are detected. In this paper, we detail the architecture and the implementation of the first step of this process. More specifically, we propose an event-based approach to obtain run time information of services developed as BPEL processes. By catching particular events (i.e., the low level information), our approach recognizes the run-time execution path of a monitored service and uses the BPEL execution patterns to compute QoS of the composite service (i.e., the high level information).
Keywords: Monitoring of Web service composition, Cloud environment, Run-time extraction of execution path of BPEL.
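To illustrate the second half of the abstract, here is a minimal Python sketch of QoS aggregation over a recognized BPEL execution path, assuming the standard pattern rules (response times add along a sequence, a parallel flow is bounded by its slowest branch, availabilities multiply). The data structure and rules are illustrative assumptions, not the authors' implementation:

```python
# Hedged sketch: aggregate QoS of a composite service from a run-time
# execution path using common BPEL pattern rules (sequence, flow/parallel).

def aggregate(node):
    """node = ('invoke', rt_ms, avail) | ('sequence', [children]) | ('flow', [children])"""
    kind = node[0]
    if kind == 'invoke':                      # leaf: one monitored service call
        _, rt, avail = node
        return rt, avail
    children = [aggregate(c) for c in node[1]]
    rts = [c[0] for c in children]
    if kind == 'sequence':                    # activities run one after another
        rt = sum(rts)
    elif kind == 'flow':                      # parallel branches: slowest dominates
        rt = max(rts)
    else:
        raise ValueError(kind)
    avail = 1.0
    for _, a in children:                     # availability multiplies in both cases
        avail *= a
    return rt, avail

# Example: one call followed by two parallel calls.
path = ('sequence', [('invoke', 120.0, 0.999),
                     ('flow', [('invoke', 80.0, 0.995),
                               ('invoke', 200.0, 0.990)])])
print(aggregate(path))   # (320.0, ~0.984)
```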
36 Optical Multicast over OBS Networks: An Approach Based On Code-Words and Tunable Decoders
Authors: Maha Sliti, Walid Abdallah, Noureddine Boudriga
Abstract:
In the frame of this work, we present an optical multicasting approach based on optical code-words. Our approach associates, in the edge node, an optical code-word to a group multicast address. In the core node, a set of tunable decoders is used to send traffic data to multiple destinations based on the received code-word. The use of code-words, each corresponding to the combination of an input port and a set of output ports, allows the implementation of an optical switching matrix. Upon reception of a burst, it is delayed in an optical memory while the received optical code-word is distributed to the set of tunable optical decoders. When the code-word matches a configured one, the delayed burst is switched to the corresponding set of output ports.
Keywords: Optical multicast, optical burst switching networks, optical code-words, tunable decoder, virtual optical memory.
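Since each code-word encodes an input port together with a set of output ports, the core-node logic reduces to a matching-and-forwarding step. A toy Python sketch of that step only (the table contents and code-word strings are illustrative assumptions; the optical hardware itself is not modeled):

```python
# Toy model of code-word matching by the tunable decoders:
# each configured code-word maps to (input_port, {output_ports}).
switch_table = {
    "1011": (1, {2, 3}),        # multicast entry: duplicate burst to ports 2 and 3
    "0110": (1, {4}),           # single-destination entry
}

def route_burst(code_word, burst):
    entry = switch_table.get(code_word)     # decoders matching in parallel
    if entry is None:
        return []                           # no match: burst is not forwarded
    _, outputs = entry
    return [(port, burst) for port in sorted(outputs)]

print(route_burst("1011", "burst#17"))      # [(2, 'burst#17'), (3, 'burst#17')]
```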
34 Edge Detection in Digital Images Using Fuzzy Logic Technique
Authors: Abdallah A. Alshennawy, Ayman A. Aly
Abstract:
The fuzzy technique is an operator introduced to simulate, at a mathematical level, the compensatory behavior in processes of decision making or subjective evaluation. This paper introduces such operators through a computer vision application: a novel method based on fuzzy logic reasoning is proposed for edge detection in digital images without determining a threshold value. The proposed approach begins by segmenting the images into regions using a floating 3x3 binary matrix. The edge pixels are mapped to a range of values distinct from each other. To assess robustness, results of the proposed method for different captured images are compared to those obtained with the linear Sobel operator. The method consistently preserves smoothness and straightness for straight lines and good roundness for curved lines; at the same time, corners become sharper and can be defined easily.
Keywords: Fuzzy logic, Edge detection, Image processing, computer vision, Mechanical parts, Measurement.
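As a rough illustration of threshold-free edge detection over a sliding 3x3 window, here is a minimal Python sketch; the membership shape and the single OR-rule below are common textbook choices, not the authors' exact rule base:

```python
import numpy as np

def fuzzy_edges(img):
    """Map each pixel to an edge membership in [0, 1] from its 3x3 neighborhood."""
    img = img.astype(float)
    h, w = img.shape
    out = np.zeros_like(img)
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            win = img[i-1:i+2, j-1:j+2]
            gx = abs(win[:, 2].sum() - win[:, 0].sum())   # horizontal contrast
            gy = abs(win[2, :].sum() - win[0, :].sum())   # vertical contrast
            mu_x = min(gx / (255.0 * 3), 1.0)             # membership "high gradient"
            mu_y = min(gy / (255.0 * 3), 1.0)
            out[i, j] = max(mu_x, mu_y)   # rule: edge IF gx high OR gy high (max s-norm)
    return out

demo = np.zeros((8, 8)); demo[:, 4:] = 255                # vertical step edge
print(fuzzy_edges(demo)[4])                               # membership peaks at the step
```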
33 Effect of Hooked-End Steel Fibres Geometry on Pull-Out Behaviour of Ultra-High Performance Concrete
Authors: Sadoon Abdallah, Mizi Fan, Xiangming Zhou
Abstract:
In this study, a comprehensive approach has been adopted to examine in detail the effect of various hook geometries on bond-slip characteristics. Extensive single-fibre pull-out tests on an ultra-high performance matrix with three different W/B ratios and embedded lengths have been carried out. Test results showed that the mechanical deformation of the fibre hook is the main mechanism governing pull-out behaviour. Furthermore, quantitative analyses have been completed to compare the contributions of the 3D, 4D and 5D hook designs to overall pull-out behaviour. It was also revealed that there is a strong relationship between the magnitude of the hook contribution and the W/B ratio (i.e. matrix strength). Reducing the W/B ratio from 0.20 to 0.11 greatly optimizes the interfacial transition zone (ITZ), enables better mobilization and straightening of the hook, and results in slip-hardening bond behaviour.
Keywords: Bond mechanisms, fibre-matrix interface, hook geometry, pullout behaviour and water to binder ratio.
32 A Self Adaptive Genetic Based Algorithm for the Identification and Elimination of Bad Data
Authors: A. A. Hossam-Eldin, E. N. Abdallah, M. S. El-Nozahy
Abstract:
The identification and elimination of bad measurements is one of the basic functions of a robust state estimator, as bad data corrupt the results of state estimation under the popular weighted least squares method. However, this is a difficult problem to handle, especially when dealing with multiple interacting, conforming errors. In this paper, a self-adaptive genetic-based algorithm is proposed. The algorithm utilizes the results of the classical linearized normal residuals approach to tune the genetic operators; instead of a randomized search throughout the whole search space, the search is directed, so the optimum solution is obtained at very early stages (a maximum of 5 generations). The algorithm utilizes accumulating databases of already computed cases to reduce the computational burden to a minimum. Tests are conducted with reference to the standard IEEE test systems, and the results are very promising.
Keywords: Bad Data, Genetic Algorithms, Linearized Normal residuals, Observability, Power System State Estimation.
31 Capacitive Air Bubble Detector Operated at Different Frequencies for Application in Hemodialysis
Authors: Mawahib Gafare Abdalrahman Ahmed, Abdallah Belal Adam, John Ojur Dennis
Abstract:
Air bubbles have been detected in the circulation of end-stage renal disease patients who are treated by hemodialysis. The consequences of air embolism are under-recognized and usually overlooked in daily practice. This paper shows results of a capacitor-based detection method capable of detecting the presence of air bubbles in the blood stream at different frequencies. The method is based on a parallel-plate capacitor made of platinum with an area of 1.5 cm² and a plate separation of 1 cm. The dielectric material used in this capacitor is Dextran 70 solution, which mimics blood rheology. Simulations were carried out using an RC circuit at two frequencies, 30 Hz and 3 kHz, and the results were compared with experiments and theory. It is observed that injecting air bubbles of different diameters into the device produces significant changes in the capacitance of the capacitor. Furthermore, the output voltage from the circuit increased with increasing air bubble diameter. These results demonstrate the feasibility of this approach in improving air bubble detection in hemodialysis.
Keywords: Air bubbles, Hemodialysis, Capacitor, Dextran70, Air bubbles diameters.
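The detection principle can be checked against the parallel-plate formula C = ε0·εr·A/d with the stated 1.5 cm² area and 1 cm gap; a bubble lowers the effective permittivity of the gap. In the Python sketch below, the volume-weighted mixing rule is a deliberate simplification and εr ≈ 70 for the Dextran-70 solution is an assumed placeholder, not a value from the paper:

```python
import math

EPS0 = 8.854e-12      # F/m, vacuum permittivity
A = 1.5e-4            # m^2, stated plate area (1.5 cm^2)
d = 1.0e-2            # m, stated plate separation (1 cm)
EPS_LIQ = 70.0        # assumed relative permittivity of the Dextran-70 solution
EPS_AIR = 1.0

def capacitance(bubble_diam_mm):
    """Crude volume-weighted effective permittivity -- illustrative mixing rule."""
    v_gap = A * d
    v_bub = (math.pi / 6.0) * (bubble_diam_mm * 1e-3) ** 3   # sphere volume
    frac = min(v_bub / v_gap, 1.0)
    eps_eff = EPS_AIR * frac + EPS_LIQ * (1.0 - frac)
    return EPS0 * eps_eff * A / d

for dia in (0.0, 1.0, 3.0, 5.0):
    print(f"bubble {dia} mm -> C = {capacitance(dia)*1e12:.4f} pF")
```

The capacitance drops measurably as the bubble grows, which is the effect the RC circuit converts into a voltage change.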
30 Predicting the Adsorptive Capacities of Biosolid as a Barrier in Soil to Remove Industrial Contaminants
Authors: Hakim Aguedal, Hafida Hentit, Abdallah Aziz, Djillali Rida Merouani, Abdelkader Iddou
Abstract:
The major environmental risk of soil pollution is the contamination of groundwater by infiltration of organic and inorganic pollutants, which can pose a serious threat. To prevent this risk and protect the groundwater, in this study we tested the reliability of a biosolid as a barrier to prevent the migration of highly dangerous pollutants such as cadmium through the different soil layers. We sought to highlight the effect of several parameters, such as turbidity (different hydration/dehydration cycles), rainfall, initial Cd(II) concentration, and soil type. These parameters allow us to find the most effective manner of integrating this barrier into the soil. From the results obtained, we found a significant effect of the barrier. Indeed, the recorded passing quantities are lowest for the highest rainfall; we also noted that the barrier has a better affinity towards higher concentrations. The largest amounts of cadmium were retained in the top layer of the two types of soil tested, while the lowest amounts were recorded in the bottom layers.
Keywords: Adsorption of Cadmium, Barrier, Groundwater Pollution, Protection.
29 A Distance Function for Data with Missing Values and Its Application
Authors: Loai AbdAllah, Ilan Shimshoni
Abstract:
Missing values in data are common in real-world applications. Since the performance of many data mining algorithms depends critically on being given a good metric over the input space, in this paper we define a distance function for unlabeled datasets with missing values. We use the Bhattacharyya distance, which measures the similarity of two probability distributions, to define our new distance function. Under this distance, the distance between two points without missing attribute values is simply the Mahalanobis distance. When, on the other hand, one of the coordinates is missing, the distance is computed according to the distribution of the missing coordinate. Our distance is general and can be used as part of any algorithm that computes the distance between data points. Because its performance depends strongly on the chosen distance measure, we opted for the k nearest neighbor classifier to evaluate the distance function's ability to accurately reflect object similarity. We experimented on standard numerical datasets from the UCI repository from different fields. On these datasets we simulated missing values and compared the performance of the kNN classifier using our distance to three other basic methods. Our experiments show that kNN using our distance function outperforms kNN using the other methods. Moreover, the runtime of our method is only slightly higher than that of the other methods.
Keywords: Missing values, Distance metric, Bhattacharyya distance.
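A simplified Python sketch of the idea, in per-coordinate form: it assumes independent coordinates (so Mahalanobis reduces to a variance-scaled difference) and replaces the integration over a missing coordinate by its closed-form expected squared difference. This is an illustration of the principle, not the paper's exact derivation:

```python
import numpy as np

def nan_distance(u, v, mean, var):
    """Distance between possibly-incomplete vectors u, v (diagonal-covariance sketch).

    known/known    : variance-scaled squared difference (diagonal Mahalanobis)
    known/missing  : E[(a - X)^2] / var = ((a - mu)^2 + var) / var
    missing/missing: E[(X - Y)^2] / var = 2 for independent draws
    """
    d2 = 0.0
    for a, b, mu, s2 in zip(u, v, mean, var):
        if np.isnan(a) and np.isnan(b):
            d2 += 2.0
        elif np.isnan(b):
            d2 += ((a - mu) ** 2 + s2) / s2
        elif np.isnan(a):
            d2 += ((b - mu) ** 2 + s2) / s2
        else:
            d2 += (a - b) ** 2 / s2
    return np.sqrt(d2)

X = np.array([[1.0, 2.0], [1.2, np.nan], [5.0, 8.0]])
mean, var = np.nanmean(X, axis=0), np.nanvar(X, axis=0)
print(nan_distance(X[0], X[1], mean, var))   # usable as the metric inside kNN
```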
28 Combining Fuzzy Logic and Neural Networks in Modeling Landfill Gas Production
Authors: Mohamed Abdallah, Mostafa Warith, Roberto Narbaitz, Emil Petriu, Kevin Kennedy
Abstract:
Heterogeneity of solid waste characteristics as well as the complex processes taking place within the landfill ecosystem motivated the implementation of soft computing methodologies such as artificial neural networks (ANN), fuzzy logic (FL), and their combination. The present work uses a hybrid ANN-FL model that employs knowledge-based FL to describe the process qualitatively and implements the learning algorithm of ANN to optimize model parameters. The model was developed to simulate and predict landfill gas production at a given time based on operational parameters. The experimental data used were compiled from a lab-scale experiment that involved various operating scenarios. The developed model was validated and statistically analyzed using the F-test, linear regression between actual and predicted data, and mean squared error measures. Overall, the simulated landfill gas production rates demonstrated reasonable agreement with actual data. The discussion focuses on the effect of the size of training datasets and the number of training epochs.
Keywords: Adaptive neural fuzzy inference system (ANFIS), gas production, landfill
27 Autonomous Flight Performance Improvement of Load-Carrying Unmanned Aerial Vehicles by Active Morphing
Authors: Tugrul Oktay, Mehmet Konar, Mohamed Abdallah Mohamed, Murat Aydin, Firat Sal, Murat Onay, Mustafa Soylak
Abstract:
In this paper, we aim to improve the autonomous flight performance of a load-carrying (payload: 3 kg, total: 6 kg) unmanned aerial vehicle (UAV) through active morphing of the wing and horizontal tail, together with integrated design of the autopilot system parameters (i.e., P, I, D gains) and UAV parameters (i.e., extension ratios of the wing and horizontal tail during flight). For this purpose, a load-carrying UAV (ZANKA-II), manufactured at the Model Aircraft Laboratory, College of Aviation, Erciyes University, is used. Optimum values of the UAV parameters and autopilot parameters are obtained using a stochastic optimization method. Using this approach, the autonomous flight performance of the UAV is substantially improved, and safe flight becomes possible even in some adverse weather conditions. The active morphing and integrated design approach offers UAV users confidence, high performance, and ease of use.
Keywords: Unmanned aerial vehicles, morphing, autopilots, autonomous performance.
26 Photogrammetry and GIS Integration for Archaeological Documentation of Ahl-Alkahf, Jordan
Authors: Rami Al-Ruzouq, Abdallah Al-Zoubi, Abdel-Rahman Abueladas, Petya Dimitrova
Abstract:
Protection and proper management of archaeological heritage are essential for its study and interpretation by present and future generations, and rest upon multidisciplinary professional collaboration. This study gathers data from different sources, Photogrammetry and Geographic Information System (GIS), integrated for the purpose of documenting one of the significant archaeological sites of Jordan (Ahl-Alkahf). 3D modeling captures the features, shapes and texture of the site to represent reality as realistically as possible. The 3D coordinates that result from the photogrammetric adjustment procedures are used to create 3D models of the study area, and adding textures to the model surfaces gives the displayed models a 'real world' appearance. The GIS combines all data, including boundary maps indicating the location of archaeological sites, a transportation layer, a digital elevation model, and orthoimages. For realistic representation of the study area, a 3D GIS model was prepared, in which efficient generation, management and visualization of such spatial data can be achieved.
Keywords: Archaeology, close range photogrammetry, ortho-photo, 3D-GIS
25 Removal of Textile Dye from Industrial Wastewater by Natural and Modified Diatomite
Authors: Hakim Aguedal, Abdelkader Iddou, Abdallah Aziz, Djillali Reda Merouani, Ferhat Bensaleh, Saleh Bensadek
Abstract:
The textile industry produces large amounts of colored effluent each year, and the management or treatment of these discharges depends on the applied techniques. Adsorption is one of the wastewater treatment techniques used against this kind of pollution, and its performance and efficiency predominantly depend on the nature of the adsorbent used. Therefore, scientific research is directed towards the development of new materials using different physical and chemical treatments to improve their adsorption capacities. In the same perspective, we examined the effect of heat treatment on the effectiveness of diatomite, which is found in abundance in Algeria. The textile dye Orange Bezaktiv (SRL-150), used as the organic pollutant in this study, was provided by the textile company SOITEXHAM in Oran city (west Algeria). The effect of different physicochemical parameters on the adsorption of SRL-150 on natural and modified diatomite was studied, and the kinetics and adsorption isotherm results were modeled.
Keywords: Wastewater treatment, diatomite, adsorption, dye pollution, kinetic, Isotherm.
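The abstract says the isotherm results were modeled; a common way to do that part is a Langmuir fit, q = q_m·K·C/(1 + K·C). A minimal Python sketch on made-up illustrative data (both the model choice and the numbers are assumptions, not the paper's results):

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(c, qm, k):
    """Langmuir isotherm: adsorbed amount q vs. equilibrium concentration c."""
    return qm * k * c / (1.0 + k * c)

# Illustrative equilibrium data (mg/L vs. mg/g) -- placeholders, not measurements
c_eq = np.array([5.0, 10.0, 25.0, 50.0, 100.0, 200.0])
q_eq = np.array([8.1, 14.2, 26.0, 36.5, 44.9, 49.8])

(qm, k), _ = curve_fit(langmuir, c_eq, q_eq, p0=(50.0, 0.01))
print(f"q_max = {qm:.1f} mg/g, K_L = {k:.4f} L/mg")
```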
24 3D Shape Modelling of Left Ventricle: Towards Correlation of Myocardial Scintigraphy Data and Coronarography Result
Authors: A. Ben Abdallah, H. Essabbah, M. H. Bedoui
Abstract:
Myocardial scintigraphy is an imaging modality which provides functional information, whereas coronarography gives useful information about coronary artery anatomy. In case of coronary artery disease (CAD), coronarography cannot determine precisely which moderate lesions (artery reduction between 50% and 70%), known as the "gray zone", are haemodynamically significant. In this paper, we aim to define the relationship between the location and degree of stenosis in the coronary arteries and the perfusion observed on myocardial scintigraphy. This allows us to model the evolving impact of these stenoses in order to justify a coronarography or to avoid it for patients suspected of being in the gray zone. Our approach is decomposed into two steps. The first step consists in modelling a coronary artery bed and stenoses of different locations and degrees. The second step consists in modelling the left ventricle at stress and at rest using the spherical harmonics model and myocardial scintigraphic data. We use the spherical harmonics descriptors to analyse left ventricle model deformation between stress and rest, which permits us to conclude whether an ischemia exists and to quantify it.
Keywords: Spherical harmonics model, vascular bed, 3D reconstruction, left ventricle, myocardial scintigraphy.
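For reference, a spherical harmonics surface model of the kind referred to here expands the ventricular surface as a truncated series, with rotation-invariant shape descriptors typically taken as the per-degree energies. These are the standard textbook definitions, sketched below, not necessarily the authors' exact parameterization:

```latex
% Surface radius expanded in spherical harmonics up to degree L:
r(\theta,\varphi) = \sum_{l=0}^{L} \sum_{m=-l}^{l} c_{lm}\, Y_l^m(\theta,\varphi),
\qquad
c_{lm} = \int_0^{2\pi}\!\!\int_0^{\pi} r(\theta,\varphi)\,
\overline{Y_l^m(\theta,\varphi)}\,\sin\theta\, d\theta\, d\varphi .

% Rotation-invariant descriptor per degree l (compared between stress and rest):
d_l = \Big( \sum_{m=-l}^{l} |c_{lm}|^2 \Big)^{1/2}.
```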
23 Resveratrol Incorporated Liposomes Prepared from Pegylated Phospholipids and Cholesterol
Authors: Mont Kumpugdee-Vollrath, Khaled Abdallah
Abstract:
Liposomes and pegylated liposomes have long been widely used as drug delivery systems in the pharmaceutical field. Conventionally, however, polyethylene glycol (PEG) was attached to the phospholipid only after the liposomes had been prepared. In this paper, we study the possibility of applying phospholipids already conjugated with PEG to prepare liposomes. The model drug resveratrol was used because it can be applied against different diseases, and cholesterol was applied to stabilize the liposome membrane. The preparation method was the thin-film technique at laboratory scale. The liposomes were then characterized by nanoparticle tracking analysis (NTA), photon correlation spectroscopy (PCS) and light microscopy. Stable liposomes could be produced, and the particle sizes after filtration were in the nanometer range. The 2- and 3-chain-PEG-phospholipid (PL) resulted in smaller particle sizes than the 4-chain-PEG-PL. Liposomes from PL 90G and cholesterol were stable during storage at 8 °C for 56 days, as the particle sizes measured by PCS remained almost unchanged. There was almost no leakage of resveratrol from the PL 90G/cholesterol liposomes during a diffusion test in a dialysis tube for 28 days. All liposomes showed sustained release over the measuring time of 270 min. The maximum release of 16-20% was detected with liposomes from 2- and 3-chain-PEG-PL; the other liposomes released only about 10% of the resveratrol. The release kinetics can be explained by the Korsmeyer-Peppas equation.
Keywords: Liposome, NTA, resveratrol, pegylation, cholesterol.
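The Korsmeyer-Peppas equation mentioned at the end is M_t/M_inf = k·t^n; taking logs turns the fit into a straight line, with n near 0.43-0.45 conventionally read as Fickian diffusion from spheres. A short Python sketch on placeholder data (the values below are illustrative, not the measured 270 min release profile):

```python
import numpy as np

# Korsmeyer-Peppas: Mt/Minf = k * t**n  ->  log(Mt/Minf) = log(k) + n*log(t)
t = np.array([15.0, 30.0, 60.0, 120.0, 270.0])      # minutes (placeholder points)
frac = np.array([0.04, 0.055, 0.08, 0.11, 0.17])    # released fraction (illustrative)

n, log_k = np.polyfit(np.log(t), np.log(frac), 1)   # slope = n, intercept = log k
print(f"release exponent n = {n:.2f}, k = {np.exp(log_k):.4f}")
```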
22 Non-Linear Numerical Modeling of the Interaction of Twin Tunnels-Structure
Authors: A. Bayoumi, M. Abdallah, F. Hage Chehade
Abstract:
Structures on the ground surface are affected by tunneling-induced settlement, especially when twin tunnels are constructed; the influence of tunneling on a structure is a critical issue that depends on the construction procedure and the relative position of the tunnels. Lebanon suffers from a traffic problem caused by the lack of transportation systems, and after several traffic counts and geotechnical investigations in Beirut city, efforts are aimed at the construction of tunneling systems. In this paper, we present non-linear numerical modeling of the effect of twin tunnel construction on the structures located at the soil surface for a particular site in Beirut. A parametric study is performed concerning the geometric configuration of the tunnels, the distance between their centers, the construction order, and the position of the structure. The tunnel-soil-structure interaction is analyzed using the non-linear finite element modeling software PLAXIS 2D. The results for surface settlement and the bending moment of the structure reveal a significant influence when the structure is moved away, especially for vertically aligned tunnels.
Keywords: Bending moment, construction procedure, elastic modulus, relative position, soil, structure location, surface settlement, twin tunnels.
21 Modeling Sustainable Truck Rental Operations Using Closed-Loop Supply Chain Network
Authors: Khaled S. Abdallah, Abdel-Aziz M. Mohamed
Abstract:
Moving industries consume numerous resources and dispose of masses of used packaging materials. Proper sorting, recycling and disposal of the packaging materials are necessary to avoid a severe pollution disaster. This paper presents a conceptual model for sustainable truck rental operations in place of the regular ones. An optimization model was developed to select the locations of truck rental centers, collection sites, and maintenance and repair sites, and to identify the rental fees to be charged for all routes so as to maximize the total closed-loop supply chain profit. Fixed costs of vehicle purchasing, costs of constructing collection centers and repair centers, and the fixed costs paid to use disposal and recycling centers are considered. Operating costs in the model include truck maintenance and repair, recycling and disposal of the packaging materials, and truck relocation. A mixed integer model is developed, followed by a simulation model to examine the factors affecting its operation.
Keywords: Modeling, truck rental, supply chain management, simulation.
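A heavily simplified sketch of the location part of such a mixed integer model: open rental centers to serve demand zones, maximizing revenue minus fixed and service costs. PuLP is used for illustration; all names and numbers, and the reduction to a plain facility-location form with fixed fees, are assumptions rather than the paper's formulation:

```python
from pulp import LpProblem, LpVariable, LpMaximize, lpSum, LpBinary, value

centers = ["C1", "C2", "C3"]
zones = ["Z1", "Z2"]
fixed = {"C1": 900, "C2": 700, "C3": 650}                 # center opening costs
revenue = {"Z1": 1200, "Z2": 1500}                        # rental revenue per zone
serve_cost = {("C1","Z1"): 100, ("C1","Z2"): 300,         # relocation/service costs
              ("C2","Z1"): 250, ("C2","Z2"): 120,
              ("C3","Z1"): 180, ("C3","Z2"): 200}

prob = LpProblem("truck_rental_location", LpMaximize)
open_c = {c: LpVariable(f"open_{c}", cat=LpBinary) for c in centers}
assign = {(c, z): LpVariable(f"x_{c}_{z}", cat=LpBinary)
          for c in centers for z in zones}

prob += lpSum(revenue[z] * assign[c, z] for c in centers for z in zones) \
      - lpSum(serve_cost[c, z] * assign[c, z] for c in centers for z in zones) \
      - lpSum(fixed[c] * open_c[c] for c in centers)
for z in zones:                                           # each zone served exactly once
    prob += lpSum(assign[c, z] for c in centers) == 1
for c in centers:
    for z in zones:                                       # only open centers may serve
        prob += assign[c, z] <= open_c[c]

prob.solve()
print([c for c in centers if value(open_c[c]) > 0.5])     # -> ['C3'] for this data
```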
20 Numerical Study of Steel Structures Responses to External Explosions
Authors: Mohammad Abdallah
Abstract:
Due to the constant increase in terrorist attacks, the research and engineering communities have given significant attention to building performance under explosions. This paper presents a methodology for studying and simulating the dynamic responses of steel structures during external detonations, particularly for accurately investigating the impact of increasing charge weight on the members' overall behavior, resistance and failure. A damage prediction method was introduced to evaluate the damage level of the steel members based on five explosion scenarios. The Johnson–Cook strength and failure model was used together with the ABAQUS finite element code to run the explicit dynamic analysis, and prior field tests were used to verify the acceptance and accuracy of the proposed material strength and failure model. Based on structural response evaluation criteria such as deflection, vertical displacement, drift index, and damage level, the obtained results show the vulnerability to blast loading of steel columns and un-braced steel frames that are designed and optimized to carry dead and live loads.
Keywords: Steel structure, blast load, terrorist attacks, charge weight, damage level.
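For reference, the Johnson–Cook flow stress used in such simulations is σ = (A + B·ε^n)(1 + C·ln ε̇*)(1 − T*^m). A small Python sketch evaluating it; the constants shown are commonly quoted mild-steel-like placeholders, not the paper's calibration:

```python
import math

def johnson_cook_stress(eps_p, eps_rate, T, A=350e6, B=275e6, n=0.36,
                        C=0.022, m=1.0, eps0=1.0, T_room=293.0, T_melt=1793.0):
    """Johnson-Cook flow stress [Pa]; all constants are illustrative placeholders."""
    strain_term = A + B * eps_p ** n                              # strain hardening
    rate_term = 1.0 + C * math.log(max(eps_rate / eps0, 1e-12))   # strain-rate effect
    T_star = (T - T_room) / (T_melt - T_room)                     # homologous temperature
    thermal_term = 1.0 - max(T_star, 0.0) ** m                    # thermal softening
    return strain_term * rate_term * thermal_term

# Blast loading drives strain rates of roughly 1e2-1e4 1/s; compare quasi-static:
for rate in (1e-3, 1.0, 1e3):
    print(f"rate {rate:8.0e} 1/s -> {johnson_cook_stress(0.1, rate, 400.0)/1e6:7.1f} MPa")
```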
19 A Markov Chain Model for Load-Balancing Based and Service Based RAT Selection Algorithms in Heterogeneous Networks
Authors: Abdallah Al Sabbagh
Abstract:
The Next Generation Wireless Network (NGWN) is expected to be a heterogeneous network which integrates all the different Radio Access Technologies (RATs) through a common platform. A major challenge is how to allocate users to the most suitable RAT for them. An optimized solution can maximize the efficient use of radio resources, achieve better performance for service providers and provide Quality of Service (QoS) with low costs to users. Currently, Radio Resource Management (RRM) is implemented efficiently for the RAT for which it was developed; however, it is not suitable for a heterogeneous network. Common RRM (CRRM) was proposed to manage radio resource utilization in the heterogeneous network. This paper presents a user-level Markov model for a network of three co-located RATs. The load-balancing based and service based CRRM algorithms have been studied using the presented Markov model, and their performance is compared in terms of traffic distribution, new call blocking probability, vertical handover (VHO) call dropping probability and throughput.
Keywords: Heterogeneous Wireless Network, Markov chain model, load-balancing based and service based algorithm, CRRM algorithms, Beyond 3G network.
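The new-call blocking probability in such a user-level model comes from the steady state of a birth-death Markov chain; for a single RAT with c channels and Poisson arrivals this reduces to the classic Erlang-B formula, sketched below in Python. The multi-RAT coupling of the paper's model is not reproduced here; the sketch only conveys the load-balancing intuition:

```python
def erlang_b(c, a):
    """Blocking probability of an M/M/c/c birth-death chain.

    c : number of channels in the RAT
    a : offered traffic in Erlangs (arrival rate / service rate)
    Uses the stable recursion B(0) = 1, B(k) = a*B(k-1) / (k + a*B(k-1)).
    """
    b = 1.0
    for k in range(1, c + 1):
        b = a * b / (k + a * b)
    return b

# Splitting 24 E of traffic across two 20-channel RATs blocks far fewer
# new calls than offering it all to one RAT:
print(f"one RAT, 24 E on 20 ch : {erlang_b(20, 24.0):.4f}")
print(f"balanced, 12 E on 20 ch: {erlang_b(20, 12.0):.6f}")
```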
18 Detecting Email Forgery using Random Forests and Naïve Bayes Classifiers
Authors: Emad E Abdallah, A.F. Otoom, ArwaSaqer, Ola Abu-Aisheh, Diana Omari, Ghadeer Salem
Abstract:
As email communications have no consistent authentication procedure to ensure authenticity, we present an investigative analysis approach for detecting forged emails based on Random Forests and Naïve Bayes classifiers. Instead of investigating the email headers, we use the body content to extract a unique writing style for all the possible suspects. Our approach consists of four main steps: (1) the cybercrime investigator extracts different effective features, including structural, lexical, linguistic, and syntactic evidence, from previous emails of all the possible suspects; (2) the extracted feature vectors are normalized to increase the accuracy rate; (3) the normalized features are used to train the learning engine; (4) upon receiving the anonymous email M, we apply the feature extraction process to produce a feature vector and, using the machine learning classifiers, the email is assigned to the suspect whose writing style most closely matches M. Experimental results on real data sets show the improved performance of the proposed method and its ability to identify the authors with a very limited number of features.
Keywords: Digital investigation, cybercrimes, emails forensics, anonymous emails, writing style, and authorship analysis
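A compact scikit-learn sketch of steps (2)-(4); step (1) is replaced here by a randomly generated stand-in matrix of stylometric features, so everything except the classifier choice is an illustrative assumption:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Step (1) stand-in: stylometric feature vectors (lexical/syntactic counts, etc.)
# for 40 emails from each of 3 known suspects -- placeholder data.
X = np.vstack([rng.normal(loc=i, scale=0.6, size=(40, 12)) for i in range(3)])
y = np.repeat([0, 1, 2], 40)

scaler = StandardScaler().fit(X)            # step (2): normalize feature vectors
Xn = scaler.transform(X)

rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(Xn, y)  # step (3)
nb = GaussianNB().fit(Xn, y)

m = scaler.transform(rng.normal(loc=1, scale=0.6, size=(1, 12)))  # anonymous email M
print("RF suspect:", rf.predict(m)[0], " NB suspect:", nb.predict(m)[0])  # step (4)
```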
17 Speaker Independent Quranic Recognizer Based on Maximum Likelihood Linear Regression
Authors: Ehab Mourtaga, Ahmad Sharieh, Mousa Abdallah
Abstract:
An automatic speech recognition system for the formal Arabic language is needed. The Quran is the most formal book spoken in Arabic and is recited all over the world. In this research, a speaker-independent automatic speech recognizer for Quranic recitation was developed and tested. The system was developed based on tri-phone Hidden Markov Models and Maximum Likelihood Linear Regression (MLLR). MLLR computes a set of transformations which reduce the mismatch between an initial model set and the adaptation data. It uses a regression class tree and estimates a set of linear transformations for the mean and variance parameters of a Gaussian mixture HMM system. The 30th chapter of the Quran, recited by five of the most famous readers of the Quran, was used for training and testing. The chapter includes about 2000 distinct words. The advantages of using the Quranic verses as the database for this recognizer are the uniqueness of the words and the high level of orderliness between verses. The accuracy on the tested data ranged from 68 to 85%.
Keywords: Hidden Markov Model (HMM), Maximum Likelihood Linear Regression (MLLR), Quran, Regression Class Tree, Speech Recognition, Speaker-independent.
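The core MLLR update described here adapts each Gaussian mean with a shared affine transform, μ̂ = A·μ + b, often written as W·ξ with ξ the extended mean vector [1, μ]. A tiny numpy sketch of applying an already-estimated transform to a set of means; estimating W from adaptation statistics and the regression class tree is omitted:

```python
import numpy as np

def mllr_adapt_means(means, W):
    """Apply an MLLR transform W = [b | A] to each Gaussian mean.

    means : (n_gauss, d) array of mixture means
    W     : (d, d+1) transform; mu_hat = W @ [1, mu] = A @ mu + b
    """
    ext = np.hstack([np.ones((means.shape[0], 1)), means])   # extended means [1, mu]
    return ext @ W.T

d = 3
means = np.arange(12, dtype=float).reshape(4, d)
W = np.hstack([np.full((d, 1), 0.5), np.eye(d) * 1.1])       # toy bias + scaling
print(mllr_adapt_means(means, W))                            # adapted means
```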
16 Effects of Blast Load on Historic Stone Masonry Buildings in Canada: A Review and Analytical Study
Authors: Abass Braimah, Maha Hussein Abdallah
Abstract:
The global rise of terrorist attacks on building infrastructure of economic and heritage significance has increased awareness of the possibility of terrorism in Canada. Many structures in Canada that are at risk of terrorist attack, including government buildings, were built many years ago of historic stone masonry construction. Although many researchers are investigating ways to retrofit stone masonry buildings to mitigate the effect of blast loading, the lack of knowledge on the dynamic behavior of historic stone masonry structures under blast loads makes it difficult to ascertain the effectiveness of the retrofitting techniques. This paper presents a review of the open-source literature on experimental and numerical studies of stone masonry structures under blast loads. The review yielded very little information on the response of historic stone masonry structures under blast loads; thus, a comprehensive study is needed to understand blast load effects on historic stone masonry buildings. The out-of-plane response of historic masonry structures to blast loads is investigated using single-degree-of-freedom analysis. This approach provides equations that can be used effectively in the analysis of historic masonry walls under out-of-plane blast loading.
Keywords: Blast loads, historical buildings, masonry structure, single-degree-of-freedom analysis.
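The single-degree-of-freedom idealization reduces the wall strip to m·ü + k·u = F(t), with the blast commonly idealized as a triangular pulse. A central-difference Python sketch of that idealization; every parameter value below is an illustrative placeholder, not a property of any historic wall studied:

```python
import numpy as np

# SDOF idealization of an out-of-plane wall strip: m*u'' + k*u = F(t)
m = 1500.0             # kg, effective mass (placeholder)
k = 2.0e6              # N/m, effective stiffness (placeholder)
F0, td = 8.0e4, 0.02   # N peak force, s positive-phase duration (placeholders)

def blast_force(t):
    """Idealized triangular pulse: peak F0 decaying linearly to zero at td."""
    return F0 * (1.0 - t / td) if t < td else 0.0

dt, n = 1e-5, 20000
u = np.zeros(n)
u[1] = 0.5 * (blast_force(0.0) / m) * dt**2    # start from rest: u1 = a0*dt^2/2
for i in range(1, n - 1):                      # central-difference time stepping
    acc = (blast_force(i * dt) - k * u[i]) / m
    u[i + 1] = 2.0 * u[i] - u[i - 1] + acc * dt**2

print(f"peak displacement = {u.max()*1000:.2f} mm")
```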
15 Investigating the Effect of Uncertainty on a LP Model of a Petrochemical Complex: Stability Analysis Approach
Authors: Abdallah Al-Shammari
Abstract:
This study discusses the effect of uncertainty on the production levels of a petrochemical complex. Uncertainty, or variation in some model parameters such as prices and the supply and demand of materials, can affect the optimality or efficiency of any chemical process. For a petrochemical complex with many plants, there are many sources of uncertainty and frequent variations which require particular attention. Many optimization approaches have been proposed in the literature to incorporate uncertainty within the model in order to obtain a robust solution. In this work, a stability analysis approach is applied to a deterministic LP model of a petrochemical complex consisting of ten plants to investigate the effect of such variations on the obtained optimal production levels. The proposed approach can determine the allowable variation ranges of some parameters, mainly objective or RHS coefficients, before the system loses its optimality. Parameters with relatively narrow ranges of variation, i.e., stability limits, are classified as sensitive parameters or constraints that need accurate estimation or intensive monitoring. These stability limits offer easy-to-use information to the decision maker, help in understanding the interaction between model parameters, and indicate when the system needs to be re-optimized. The study shows that the maximum production of ethylene and the prices of intermediate products are the most sensitive factors affecting the stability of the optimum solution.
Keywords: Linear programming, Petrochemicals, stability analysis, uncertainty
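One brute-force way to reproduce the idea of stability limits is to re-solve the LP while perturbing one objective coefficient and record where the optimal production plan changes. A toy Python sketch with scipy; the two-product LP stands in for the ten-plant complex and all numbers are invented:

```python
import numpy as np
from scipy.optimize import linprog

# Toy stand-in LP: maximize profit of two products under two shared capacities.
c = np.array([-40.0, -30.0])                  # linprog minimizes, so negate profits
A_ub = np.array([[1.0, 1.0], [2.0, 1.0]])
b_ub = np.array([100.0, 150.0])

base = linprog(c, A_ub=A_ub, b_ub=b_ub, method="highs").x

# Scan the first profit coefficient to find where the optimal plan switches;
# the interval over which 'changed' stays False is the stability limit.
for p in np.arange(20.0, 65.0, 5.0):
    x = linprog([-p, c[1]], A_ub=A_ub, b_ub=b_ub, method="highs").x
    changed = not np.allclose(x, base, atol=1e-6)
    print(f"profit1 = {p:4.1f} -> plan {np.round(x, 2)}  changed: {changed}")
```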
14 An Enhanced SAR-Based Tsunami Detection System
Authors: Jean-Pierre Dubois, Jihad S. Daba, H. Karam, J. Abdallah
Abstract:
Tsunami early detection and warning systems have proved to be of ultimate importance, especially after the destructive tsunami that hit Japan in March 2011. Such systems are crucial to inform the authorities of any risk of a tsunami and of the degree of its danger in order to make the right decisions and notify the public of the actions they need to take to save their lives. The purpose of this research is to enhance existing tsunami detection and warning systems. We first propose an automated and miniaturized model of an early tsunami detection and warning system. The operation of the tsunami warning system is simulated using the data acquisition toolbox of Matlab, with measurements acquired from specified internet pages owing to the lack of the required real-life seismic and hydrologic sensors, and a graphical user interface is built for the system. In the second phase of this work, we implement various satellite image filtering schemes to enhance the acquired synthetic aperture radar images of the tsunami-affected region, which are masked by speckle noise. This enables us to conduct a post-tsunami damage extent study and calculate the percentage damage. We conclude by proposing improvements to the telecommunication infrastructure of existing tsunami warning systems through a migration to IP-based networks and fiber optic links.
Keywords: Detection, GIS, GSN, GTS, GPS, speckle noise, synthetic aperture radar, tsunami, wiener filter.
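The speckle-filtering step can be prototyped directly with scipy's adaptive Wiener filter. In the Python sketch below, the synthetic image stands in for a SAR scene and the gamma-distributed multiplicative factor is a common speckle approximation; none of it reproduces the paper's actual imagery:

```python
import numpy as np
from scipy.signal import wiener

rng = np.random.default_rng(1)
clean = np.zeros((128, 128)); clean[32:96, 32:96] = 1.0       # stand-in "scene"
speckle = rng.gamma(shape=4.0, scale=0.25, size=clean.shape)  # mean-1 multiplicative noise
noisy = clean * speckle + 0.05 * rng.normal(size=clean.shape)

filtered = wiener(noisy, mysize=5)                            # local adaptive Wiener filter

mse = lambda a: np.mean((a - clean) ** 2)
print(f"MSE noisy    = {mse(noisy):.4f}")
print(f"MSE filtered = {mse(filtered):.4f}")
```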
13 A Comprehensive Survey on RAT Selection Algorithms for Heterogeneous Networks
Authors: Abdallah AL Sabbagh, Robin Braun, Mehran Abolhasan
Abstract:
Due to the coexistence of different Radio Access Technologies (RATs), Next Generation Wireless Networks (NGWN) are predicted to be heterogeneous in nature. The coexistence of different RATs requires Common Radio Resource Management (CRRM) to support the provision of Quality of Service (QoS) and the efficient utilization of radio resources. RAT selection algorithms are part of the CRRM algorithms. Simply put, their role is to verify whether an incoming call can fit into the heterogeneous wireless network, to decide which of the available RATs is most suitable for the incoming call, and to admit it. The goal of a RAT selection algorithm is to guarantee the QoS requirements of all accepted calls while providing the most efficient utilization of the available radio resources. Normal call admission control algorithms are designed for homogeneous wireless networks and do not provide a solution that fits a heterogeneous wireless network such as the NGWN; therefore, there is a need to develop RAT selection algorithms for heterogeneous wireless networks. In this paper, we propose an approach for RAT selection which includes receiving different criteria, assessing them and making decisions, then selecting the most suitable RAT for incoming calls. A comprehensive survey of different RAT selection algorithms for heterogeneous wireless networks is also presented.
Keywords: Heterogeneous Wireless Network, RAT selection algorithms, Next Generation Wireless Network (NGWN), Beyond 3G Network, Common Radio Resource Management (CRRM).
12 An Edge Detection and Filtering Mechanism of Two Dimensional Digital Objects Based on Fuzzy Inference
Authors: Ayman A. Aly, Abdallah A. Alshnnaway
Abstract:
The general idea behind the filter is to average a pixel using other pixel values from its neighborhood while simultaneously taking care of important image structures such as edges. The main concern of the proposed filter is to distinguish between variations of the captured digital image due to noise and those due to image structure. Edges give the image the appearance of depth and sharpness; a loss of edges makes the image appear blurred or unfocused. However, noise smoothing and edge enhancement are traditionally conflicting tasks: since most noise filtering behaves like a low-pass filter, the blurring of edges and loss of detail seem a natural consequence, and techniques to remedy this inherent conflict often generate new noise during enhancement. In this work a new fuzzy filter is presented for the reduction of additive noise in images. The filter consists of three stages: (1) define fuzzy sets in the input space to compute a fuzzy derivative for eight different directions; (2) construct a set of IF-THEN rules to perform fuzzy smoothing according to the contributions of neighboring pixel values; and (3) define fuzzy sets in the output space to obtain the filtered and edged image. Experimental results show the feasibility of the proposed approach on two-dimensional objects.
Keywords: Additive noise, edge preserving filtering, fuzzy image filtering, noise reduction, two dimensional mechanical images.
11 Distances over Incomplete Diabetes and Breast Cancer Data Based on Bhattacharyya Distance
Authors: Loai AbdAllah, Mahmoud Kaiyal
Abstract:
Missing values in real-world datasets are a common problem. Many algorithms have been developed to deal with it; most replace the missing values with a fixed value computed from the observed values. In our work, we used a distance function based on the Bhattacharyya distance, which measures the similarity of two probability distributions, to measure the distance between objects with missing values. The proposed distance distinguishes between known and unknown values: the distance between two known values is the Mahalanobis distance, whereas when one of them is missing the distance is computed based on the distribution of the known values for the coordinate containing the missing value. This method was integrated with Wikaya, a digital health company developing a platform that helps to improve prevention of chronic diseases such as diabetes and cancer. For Wikaya's recommendation system to work, distances between users need to be measured; since there are missing values in the collected data, a distance function between incomplete user profiles had to be developed. To evaluate the accuracy of the proposed distance function in reflecting the actual similarity between different objects when some of them contain missing values, we integrated it within the framework of the k nearest neighbors (kNN) classifier, since its computation is based only on the similarity between objects. To validate this, we ran the algorithm over the diabetes and breast cancer datasets, standard benchmark datasets from the UCI repository. Our experiments show that the kNN classifier using our proposed distance function outperforms kNN using other existing methods.
Keywords: Missing values, distance metric, Bhattacharyya distance.
10 Investigating the Transformer Operating Conditions for Evaluating the Dielectric Response
Authors: Jalal M. Abdallah
Abstract:
This paper presents an experimental investigation of transformer dielectric response and solid insulation water content. The dielectric response was measured using the hybrid Frequency Dielectric Spectroscopy and Polarization Current (FDS & PC) method, and the water content in paper is calculated from the water content in oil and the obtained equilibrium curves. Reference measurements were performed under equilibrium conditions for water content in the oil and paper of the transformer at different stable temperatures (25, 50, 60 and 70 °C), to provide references for evaluating the insulation behavior under non-equilibrium conditions. Further measurements were performed in different simulated normal working modes of transformer operation at the same temperatures as the equilibrium conditions. The obtained results show that when the transformer temperature is much higher than the ambient temperature, the transformer temperature decreases immediately after disconnecting the transformer from the network, and this temperature reduction influences the insulation condition during the measuring process. In addition to the oil temperature near the sensors, the temperature uniformity in the transformer, which can be altered by a large change in the transformer load before the measuring time, will influence the result. The investigations have shown that the time between disconnecting the transformer and beginning the measurements has an extremely strong influence on the results. Online monitoring of the water content in paper was also performed, on the basis of online monitoring of the water content in oil and the obtained equilibrium curves; these measurements were carried out continuously for about 50 days, without any disconnection, in the prepared adiabatic room.
Keywords: Conductivity, Moisture, Temperature, Oil-paper insulation, Online monitoring, Water content in oil.
9 Power Transformers Insulation Material Investigations: Partial Discharge
Authors: Jalal M. Abdallah
Abstract:
There is a great problem in testing and investigating the reliability of different types of transformer insulation materials: how to recreate and simulate the real conditions of a working transformer and test its insulation materials for partial discharge (PD) as in the working mode. Many tests may give untrue results, as the physical behavior of the insulation material under test differs from that in its working condition. In this work, the real working conditions were simulated and a large number of specimens were tested. The first stage of the investigation begins with choosing samples of different types of insulation materials (papers, pressboards, etc.). In the second stage, the samples were dried in ovens at 105 °C and 0.01 bar for 48 hours, then impregnated with dried, degassed oil (water content less than 6 ppm) at 105 °C and 0.01 bar for 48 hours, after which the specimens were cooled at room pressure and temperature for 24 hours. The third stage is investigating PD in the samples using the ICM PD measuring device. After that, a continuous test on oil-impregnated insulation materials (paper, pressboards) was developed, and the phase-resolved partial discharge pattern of the PD signals was measured. The importance of this work lies in providing the industrial sector with trusted, highly accurate measurement results based on realistically simulated working conditions. All the PD patterns (results) are associated with discharges produced in well-controlled laboratory conditions, and they are compared with previous results and results from other laboratories. In addition, the influence of different temperature conditions on the partial discharge activity was studied.
Keywords: Transformers, insulation materials, voids, partial discharge (PD).
8 Optimized Brain Computer Interface System for Unspoken Speech Recognition: Role of Wernicke Area
Authors: Nassib Abdallah, Pierre Chauvet, Abd El Salam Hajjar, Bassam Daya
Abstract:
In this paper, we propose an optimized brain computer interface (BCI) system for unspoken speech recognition, based on the fact that the construction of unspoken words relies strongly on Wernicke's area, situated in the temporal lobe. Our BCI system has four modules: (i) the EEG acquisition module, based on a non-invasive headset with 14 electrodes; (ii) the preprocessing module, which removes noise and artifacts using the Common Average Reference method; (iii) the feature extraction module, using the Wavelet Packet Transform (WPT); and (iv) the classification module, based on a one-hidden-layer artificial neural network. The present study compares the recognition accuracy of 5 Arabic words when using all the headset electrodes or only the 4 electrodes situated near Wernicke's area, as well as the effect of selecting among the subbands produced by the WPT module. After applying the artificial neural network to the produced database, we obtain, on the test dataset, an accuracy of 83.4% with all the electrodes and all the subbands of the 8-level WPT decomposition. However, by using only the 4 electrodes near Wernicke's area and the 6 middle subbands of the WPT, we obtain a large reduction of the dataset size, to approximately 19% of the total dataset, with an accuracy rate of 67.5%. This reduction appears particularly important for the design of a low-cost and simple-to-use BCI trained for several words.
Keywords: Brain-computer interface, speech recognition, electroencephalography EEG, Wernicke area, artificial neural network.
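The WPT feature-extraction module can be prototyped with PyWavelets: decompose each electrode's signal into level-N subbands and keep per-subband energies as features. In this Python sketch the wavelet choice, the random stand-in signal, and the "middle subbands" slice are all assumptions for illustration:

```python
import numpy as np
import pywt

def wpt_band_energies(signal, wavelet="db4", level=8):
    """Energy of each wavelet-packet subband at the given decomposition level."""
    wp = pywt.WaveletPacket(data=signal, wavelet=wavelet,
                            mode="symmetric", maxlevel=level)
    nodes = wp.get_level(level, order="freq")        # subbands ordered by frequency
    return np.array([np.sum(node.data ** 2) for node in nodes])

rng = np.random.default_rng(0)
eeg = rng.normal(size=1024)                          # stand-in for one EEG channel
energies = wpt_band_energies(eeg)                    # 2**8 = 256 subband energies

# Keep only a middle slice of subbands, echoing the paper's "6 middle subbands"
middle = energies[125:131]
print(len(energies), middle.round(3))
```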