Search results for: Testing techniques
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3375

1125 Cr, Fe and Se Contents of the Turkish Black and Green Teas and the Effect of Lemon Addition

Authors: E. Moroydor Derun, A. S. Kipcak, O. Dere Ozdemir, M. B. Piskin

Abstract:

Tea is consumed by a large part of the world's population and has enormous importance in Turkish culture, where it is brewed nearly every morning and evening in most households and is often served with a lemon wedge. Habitual drinking of tea infusions may contribute significantly to the daily dietary requirements of trace elements. Different instrumental techniques are used for the determination of these elements, with atomic and mass spectroscopic methods preferred most. In this study, the chromium, iron and selenium contents of black and green tea after hot-water brewing were determined by Inductively Coupled Plasma Optical Emission Spectroscopy (ICP-OES). Furthermore, the effect of lemon addition on the chromium, iron and selenium concentrations of the tea infusions was investigated. The results showed that the concentrations of chromium, iron and selenium increased in black tea with lemon addition, whereas in green tea only selenium increased. Iron was not detected in green tea, but its concentration was determined as 1.420 ppm after lemon addition.

Keywords: Black tea, green tea, ICP-OES, lemon

1124 MIMO Broadcast Scheduling for Weighted Sum-rate Maximization

Authors: Swadhin Kumar Mishra, Sidhartha Panda, C. Ardil

Abstract:

Multiple-Input-Multiple-Output (MIMO) is one of the most important communication techniques that allow wireless systems to achieve higher data rates. To overcome the practical difficulties in implementing Dirty Paper Coding (DPC), various suboptimal MIMO Broadcast (MIMO-BC) scheduling algorithms are employed which choose the best set of users among all the users. In this paper we discuss such a sub-optimal MIMO-BC scheduling algorithm which employs antenna selection at the receiver side. The channels of the users considered here are not independent and identically distributed (IID), so users at the receiver side do not get equal opportunity for communication. We therefore introduce a method of applying weights to the channels of the non-IID users in such a way that each user gets an equal opportunity for communication. The effect of the weights on the overall sum-rate achieved by the system has been investigated and presented.
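
To make the weighted sum-rate objective concrete, the sketch below (a minimal illustration, not the authors' algorithm) greedily selects users to maximize sum_k w_k log2(1 + SINR_k), with hypothetical SINR values and weights chosen inversely proportional to each user's average channel gain so that the weaker, non-IID users get more opportunity:

```python
import numpy as np

def weighted_sum_rate(sinr, weights, selected):
    """Weighted sum-rate (bits/s/Hz) of the selected user set."""
    idx = list(selected)
    return float(np.sum(weights[idx] * np.log2(1.0 + sinr[idx])))

def greedy_user_selection(sinr, weights, max_users):
    """Greedily add the user that most increases the weighted sum-rate."""
    selected, remaining = set(), set(range(len(sinr)))
    while remaining and len(selected) < max_users:
        best = max(remaining,
                   key=lambda u: weighted_sum_rate(sinr, weights, selected | {u}))
        if weighted_sum_rate(sinr, weights, selected | {best}) <= \
           weighted_sum_rate(sinr, weights, selected):
            break
        selected.add(best)
        remaining.remove(best)
    return selected

rng = np.random.default_rng(0)
avg_gain = rng.uniform(0.2, 2.0, size=8)              # non-IID average channel gains
sinr = avg_gain * rng.exponential(5.0, size=8)        # instantaneous per-user SINR
weights = 1.0 / avg_gain                              # favour weaker users
weights /= weights.sum()
chosen = greedy_user_selection(sinr, weights, max_users=4)
print(chosen, weighted_sum_rate(sinr, weights, chosen))
```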

Keywords: Antenna selection, Independent and identically distributed (IID), Sum-rate capacity, Weighted sum rate.

1123 Volterra Filtering Techniques for Removal of Gaussian and Mixed Gaussian-Impulse Noise

Authors: M. B. Meenavathi, K. Rajesh

Abstract:

In this paper, we propose a new class of Volterra series based filters for image enhancement and restoration. Linear filters generally reduce noise but cause blurring at the edges. Some nonlinear filters based on the median or rank operator deal only with impulse noise and fail to cancel the most common, Gaussian-distributed noise. A class of second-order Volterra filters is proposed to optimize the trade-off between noise removal and edge preservation. In this paper, we consider both Gaussian and mixed Gaussian-impulse noise to test the robustness of the filter. Image enhancement and restoration results using the proposed Volterra filter are found to be superior to those obtained with standard linear and nonlinear filters.
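
As a minimal illustration of the filter class discussed (a 1D sketch with arbitrary small kernels, not the authors' image kernels), a second-order Volterra filter adds a quadratic term to the usual linear convolution:

```python
import numpy as np

def volterra2(x, h1, h2):
    """Second-order (quadratic) Volterra filter:
       y[n] = sum_i h1[i] x[n-i] + sum_i sum_j h2[i,j] x[n-i] x[n-j]."""
    N, M = len(x), len(h1)
    xp = np.concatenate([np.zeros(M - 1), x])       # zero padding for causality
    y = np.zeros(N)
    for n in range(N):
        window = xp[n:n + M][::-1]                  # x[n], x[n-1], ..., x[n-M+1]
        y[n] = h1 @ window + window @ h2 @ window
    return y

# Illustrative kernels: a smoothing linear part and a tiny quadratic correction.
h1 = np.array([0.25, 0.5, 0.25])
h2 = 0.01 * np.eye(3)

t = np.linspace(0, 1, 200)
clean = np.sin(2 * np.pi * 5 * t)
noisy = clean + 0.2 * np.random.default_rng(1).normal(size=t.size)
filtered = volterra2(noisy, h1, h2)
print(np.mean((noisy - clean) ** 2), np.mean((filtered - clean) ** 2))
```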

Keywords: Gaussian noise, Image enhancement, Image restoration, Linear filters, Nonlinear filters, Volterra series.

1122 Metal-Based Anticancer Agents: In vitro DNA Binding, Cleavage and Cytotoxicity

Authors: Mala Nath, Nagamani Kompelli, Partha Roy, Snehasish Das

Abstract:

Two new metal-based anticancer chemotherapeutic agents, [(Ph2Sn)2(HGuO)2(phen)Cl2] 1 and [(Ph3Sn)(HGuO)(phen)]Cl.CH3OH.H2O 2, were designed, prepared and characterized by analytical and spectral (IR, ESI-Mass, ^1H, ^13C and ^119Sn NMR) techniques. The proposed geometry of Sn(IV) in 1 and 2 is distorted octahedral and distorted trigonal-bipyramidal, respectively. Both 1 and 2 exhibit potential cytotoxicity in vitro against MCF-7, HepG-2 and DU-145 cell lines. The intrinsic binding constant (Kb) values of 1 (2.33 × 10^5 M^-1) and 2 (2.46 × 10^5 M^-1), evaluated from UV-Visible absorption studies, suggest a non-classical electrostatic mode of interaction via the phosphate backbone of the DNA double helix. The Stern-Volmer quenching constants (Ksv) of 1 (9.74 × 10^5 M^-1) and 2 (2.9 × 10^6 M^-1), determined by fluorescence studies, suggest a groove-binding and an intercalation mode for 1 and 2, respectively. Effective cleavage of pBR322 DNA is induced by 1. Their interaction with the DNA of cancer cells may account for their potency.
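
For reference, the relations commonly used to extract such constants are given below in LaTeX; the Wolfe-Shimer form for Kb from an absorption titration and the Stern-Volmer equation for Ksv are assumed here, since the authors' exact fitting procedure is not detailed in the abstract:

```latex
% Intrinsic binding constant K_b from a UV-Vis absorption titration (Wolfe-Shimer form)
\frac{[\mathrm{DNA}]}{\varepsilon_a-\varepsilon_f}
  = \frac{[\mathrm{DNA}]}{\varepsilon_b-\varepsilon_f}
  + \frac{1}{K_b\,(\varepsilon_b-\varepsilon_f)}

% Stern-Volmer quenching constant K_{sv} from fluorescence quenching data
\frac{F_0}{F} = 1 + K_{sv}\,[Q]
```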

Keywords: Anticancer agents, DNA binding studies, NMR spectroscopy, organotin.

1121 One-Pot Facile Synthesis of N-Doped Graphene Synthesized from Paraphenylenediamine as Metal-Free Catalysts for the Oxygen Reduction Used for Alkaline Fuel Cells

Authors: Leila Samiee, Amir Yadegari, Saeedeh Tasharrofi

Abstract:

In the work presented here, nitrogen-doped graphene materials were synthesized and used as metal-free electrocatalysts for the oxygen reduction reaction (ORR) under alkaline conditions. Paraphenylenediamine was used as the nitrogen precursor. The N-doped graphene was synthesized under hydrothermal treatment at 200°C. All the materials were characterized by X-ray diffraction (XRD), Fourier transform infrared spectroscopy (FTIR), transmission electron microscopy (TEM) and X-ray photoelectron spectroscopy (XPS). Moreover, for the electrochemical evaluation of the samples, rotating disk electrode (RDE) and cyclic voltammetry (CV) techniques were employed. The resulting material exhibits outstanding catalytic activity for the ORR as well as excellent resistance towards methanol crossover effects, indicating its promising potential as an ORR electrocatalyst for alkaline fuel cells.

Keywords: Alkaline fuel cell, graphene, metal-free catalyst, paraphenylenediamine.

1120 Single-Camera Basketball Tracker through Pose and Semantic Feature Fusion

Authors: Adrià Arbués-Sangüesa, Coloma Ballester, Gloria Haro

Abstract:

Tracking sports players is a widely challenging scenario, especially in single-feed videos recorded in tight courts, where cluttering and occlusions cannot be avoided. This paper presents an analysis of several geometric and semantic visual features to detect and track basketball players. An ablation study is carried out and then used to show that a robust tracker can be built with Deep Learning features, without the need to extract contextual ones, such as proximity or color similarity, nor to apply camera stabilization techniques. The presented tracker consists of: (1) a detection step, which uses a pretrained deep learning model to estimate the players' pose, followed by (2) a tracking step, which leverages pose and semantic information from the output of a convolutional layer in a VGG network. Its performance is analyzed in terms of MOTA over a basketball dataset with more than 10k instances.
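
A minimal sketch of the kind of track-to-detection association such a pipeline needs is given below; it is not the authors' tracker. It combines a hypothetical pose-keypoint distance with the cosine distance between appearance vectors (standing in for pooled VGG conv-layer activations) and solves the assignment with the Hungarian algorithm:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def cosine_distance(a, b):
    """1 - cosine similarity between two feature vectors."""
    return 1.0 - float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))

def associate(tracks, detections, alpha=0.5):
    """Match existing tracks to new detections.

    Each track/detection is a dict with:
      'pose'     : (K, 2) array of 2D keypoints (e.g. from a pretrained pose model)
      'features' : 1D appearance vector (e.g. pooled VGG conv-layer activations)
    """
    cost = np.zeros((len(tracks), len(detections)))
    for i, tr in enumerate(tracks):
        for j, det in enumerate(detections):
            pose_d = np.mean(np.linalg.norm(tr['pose'] - det['pose'], axis=1))
            feat_d = cosine_distance(tr['features'], det['features'])
            cost[i, j] = alpha * pose_d / 100.0 + (1 - alpha) * feat_d
    rows, cols = linear_sum_assignment(cost)          # Hungarian assignment
    return list(zip(rows.tolist(), cols.tolist()))

rng = np.random.default_rng(2)
make = lambda: {'pose': rng.uniform(0, 200, (15, 2)), 'features': rng.normal(size=128)}
print(associate([make(), make()], [make(), make(), make()]))
```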

Keywords: Basketball, deep learning, feature extraction, single-camera, tracking.

1119 A Systematic Review on the Integration of Project Management with Organizational Flows

Authors: Maurício Covolan Rosito, Ricardo Melo Bastos

Abstract:

Software projects are very dynamic and require recurring adjustments of their project plans. These adjustments can be understood as reconfigurations of the schedule, the resource allocation and other design elements. Moreover, during the planning and execution of a software project, the integration of the project's specific activities with the activities that take part in the organization's common activity flow should be considered. This article presents the results of a systematic review of aspects related to the dynamic reconfiguration of software projects, emphasizing the integration of project management with organizational flows. A series of studies from the year 2000 to the present was analyzed. The results of this work show that there is a diversity of techniques and strategies for the dynamic reconfiguration of software projects. However, few approaches consider the integration of software project activities with the activities that take part in the organization's common workflow.

Keywords: Dynamic Reconfiguration, Organizational workflows, Project Management, Systematic Review

1118 Automatic Detection and Classification of Diabetic Retinopathy Using Retinal Fundus Images

Authors: A. Biran, P. Sobhe Bidari, A. Almazroe, V. Lakshminarayanan, K. Raahemifar

Abstract:

Diabetic Retinopathy (DR) is a severe retinal disease caused by diabetes mellitus. It leads to blindness when it progresses to the proliferative level. Early indications of DR are the appearance of microaneurysms, hemorrhages and hard exudates. In this paper, an automatic algorithm for the detection of DR is proposed. The algorithm is based on a combination of several image processing techniques, including the Circular Hough Transform (CHT), Contrast Limited Adaptive Histogram Equalization (CLAHE), Gabor filtering and thresholding. A Support Vector Machine (SVM) classifier is then used to classify retinal images as normal or abnormal, the latter covering non-proliferative and proliferative DR. The proposed method has been tested on images selected from the Structured Analysis of the Retina (STARE) database using MATLAB code. The method is able to detect DR; the sensitivity, specificity and accuracy of this approach are 90%, 87.5% and 91.4%, respectively.
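
The sketch below illustrates the flavour of such a pipeline in Python with OpenCV and scikit-learn (CLAHE enhancement, Gabor responses, bright-lesion thresholding, SVM classification); the feature choices, thresholds and the placeholder data loader are assumptions, and the CHT-based localization step is omitted:

```python
import cv2
import numpy as np
from sklearn.svm import SVC

def retina_features(img_bgr):
    """Simple feature vector from a fundus image: CLAHE-enhanced green-channel
    statistics, Gabor filter responses, and the bright-lesion (exudate) area
    obtained by thresholding. Illustrative only."""
    green = img_bgr[:, :, 1]
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    enh = clahe.apply(green)

    feats = [enh.mean(), enh.std()]
    for theta in np.arange(0, np.pi, np.pi / 4):           # 4 Gabor orientations
        kern = cv2.getGaborKernel((21, 21), 4.0, theta, 10.0, 0.5, 0)
        resp = cv2.filter2D(enh, cv2.CV_32F, kern)
        feats += [resp.mean(), resp.std()]

    _, bright = cv2.threshold(enh, 220, 255, cv2.THRESH_BINARY)
    feats.append(bright.mean() / 255.0)                    # fraction of bright pixels
    return np.array(feats)

# Hypothetical training loop over labelled fundus images (0 = normal, 1 = DR):
# images, labels = load_stare_subset()                     # placeholder loader
# X = np.stack([retina_features(im) for im in images])
# clf = SVC(kernel='rbf').fit(X, labels)
# print(clf.predict(X[:5]))
```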

Keywords: Diabetic retinopathy, fundus images, STARE, Gabor filter, SVM.

1117 EFL Learners' Perceptions of Computer-Mediated Communication (CMC) to Facilitate Communication in a Foreign Language

Authors: Huifen Lin, Yueh-chiu Fang

Abstract:

This study explores the perceptions of English as a Foreign Language (EFL) learners on using computer-mediated communication technology in their learning of English. The data consist of observations of both the synchronous and asynchronous communication that participants engaged in over a period of four months, together with online and offline communication protocols, open-ended interviews and reflection papers composed by the participants. Content analysis of the interview data and the written documents listed above, as well as member checking and triangulation techniques, are the major data analysis strategies. The findings suggest that participants generally do not benefit from computer-mediated communication in terms of its effect on learning a foreign language. Participants regarded the nature of CMC as artificial, or pseudo-communication, which did not aid their authentic communication skills in English. The results of this study shed light on the insufficient and inconclusive findings that most previous quantitative CMC studies have generated.

Keywords: computer-mediated communication, EFL, writing

1116 Preparation of Tender for Building Conservation Work: Current Practices in Malaysia

Authors: Q.Y. Lee, Y.M. Lim

Abstract:

Building conservation work generally involves complex and non-standard work that differs from new building construction processes. In preparing tenders for building conservation projects, therefore, the quantity surveyor must carefully consider the specificity of non-standard items and demarcate the scope of the unique conservation work. While the quantity surveyor must appreciate the full range of works to prepare a good tender document, he typically manages many unfamiliar elements, including practical construction methods, restoration techniques and work sequences. Only by fulfilling the demanding requirements of building conservation work can the quantity surveyor enhance his professionalism in an area of growing cultural value and economic importance. By discussing several issues crucial to tender preparation for building conservation projects in Malaysia, this paper seeks a deeper understanding of how quantity surveying can better standardize tender preparation work and more successfully manage building conservation processes.

Keywords: Conservation Works, Quantity Surveying Practice, Tender Preparation, Malaysia

1115 Optimization for Subcritical Water Extraction of Phenolic Compounds from Rambutan Peels

Authors: Nuttawan Yoswathana, M. N. Eshtiaghi

Abstract:

Rambutan is a tropical fruit whose peel possesses antioxidant properties. This work was conducted to optimize the extraction conditions of phenolic compounds from rambutan peel. Response surface methodology (RSM) was adopted to optimize subcritical water extraction (SWE) over temperature, extraction time and percent solvent mixture. The results demonstrated that the optimum conditions for SWE were as follows: temperature 160°C, extraction time 20 min and a concentration of 50% ethanol. A comparison of the phenolic compounds obtained from rambutan peel by maceration (6 h), Soxhlet extraction (4 h) and SWE (20 min) indicated a total phenolic content (determined using the Folin-Ciocalteu phenol reagent) of 26.42, 70.29 and 172.47 mg of tannic acid equivalent (TAE) per g of dry rambutan peel, respectively. The comparative study concluded that SWE is a promising technique for the extraction of phenolic compounds from rambutan peel, owing to yields more than twice those of the conventional techniques and shorter extraction times.
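
A minimal sketch of the response-surface step is shown below: a second-order polynomial surface is fitted over temperature, time and ethanol fraction and evaluated on a grid to locate the predicted optimum. The design points and responses are synthetic placeholders, not the paper's experimental data:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

# Hypothetical design points: (temperature degC, time min, ethanol %) and responses
# (total phenolic content, mg TAE/g). Values are synthetic placeholders.
X = np.array([[120, 10, 0], [120, 30, 50], [160, 20, 50],
              [160, 10, 100], [200, 30, 0], [200, 20, 100],
              [160, 30, 0], [120, 20, 100], [200, 10, 50]], dtype=float)
y = np.array([60, 95, 170, 120, 90, 110, 130, 80, 100], dtype=float)

poly = PolynomialFeatures(degree=2, include_bias=True)
model = LinearRegression().fit(poly.fit_transform(X), y)

# Evaluate the fitted quadratic surface on a grid and report the predicted optimum.
T, t, e = np.meshgrid(np.linspace(120, 200, 41),
                      np.linspace(10, 30, 21),
                      np.linspace(0, 100, 21), indexing='ij')
grid = np.column_stack([T.ravel(), t.ravel(), e.ravel()])
pred = model.predict(poly.transform(grid))
print("predicted optimum:", grid[np.argmax(pred)], "->", pred.max())
```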

Keywords: Subcritical water extraction, Rambutan peel, phenolic compounds, response surface methodology

1114 An Approach to Correlate the Statistical-Based Lorenz Method, as a Way of Measuring Heterogeneity, with Kozeny-Carman Equation

Authors: H. Khanfari, M. Johari Fard

Abstract:

Dealing with carbonate reservoirs can be mind-boggling for reservoir engineers due to the various diagenetic processes that cause a variety of properties throughout the reservoir. A good estimation of reservoir heterogeneity, which is defined as the variation in rock properties with location in a reservoir or formation, can help in modeling the reservoir and thus offer a better understanding of its behavior. Most reservoirs are heterogeneous formations whose mineralogy, organic content, natural fractures and other properties vary from place to place. Over the years, reservoir engineers have tried to establish methods to describe this heterogeneity, because heterogeneity is important in modeling reservoir flow and in well testing. Geological methods describe the variations in rock properties based on the similarities of the environments in which different beds were deposited. To illustrate the vertical heterogeneity of a reservoir, two methods are generally used in petroleum work: the Dykstra-Parsons permeability variation (V) and the Lorenz coefficient (L), which are reviewed briefly in this paper. The Lorenz concept is based on statistics and has been used in petroleum from that point of view. In this paper, we correlate the statistical-based Lorenz method to a petroleum concept, i.e. the Kozeny-Carman equation, and derive the straight-line Lorenz plot for a homogeneous system. Finally, we apply the two methods to a heterogeneous field in South Iran and discuss each separately, with numbers and figures. As expected, these methods show great departure from homogeneity. Therefore, for future investment, the reservoir needs to be treated carefully.
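
The sketch below computes the standard static Lorenz coefficient from layer permeability, porosity and thickness (0 for a homogeneous system, approaching 1 for a highly heterogeneous one); the paper's link to the Kozeny-Carman equation is not reproduced here, and the example layer values are hypothetical:

```python
import numpy as np

def lorenz_coefficient(k, phi, h):
    """Static Lorenz coefficient from layer permeability k, porosity phi and
    thickness h. Layers are ordered by decreasing k/phi; the coefficient is
    twice the area between the flow-capacity/storage-capacity curve and the
    unit diagonal."""
    k, phi, h = map(np.asarray, (k, phi, h))
    order = np.argsort(k / phi)[::-1]
    flow = np.cumsum((k * h)[order]) / np.sum(k * h)          # fraction of k*h
    storage = np.cumsum((phi * h)[order]) / np.sum(phi * h)   # fraction of phi*h
    flow = np.concatenate([[0.0], flow])
    storage = np.concatenate([[0.0], storage])
    area = np.trapz(flow, storage)                            # area under the curve
    return 2.0 * (area - 0.5)

# Example: a layered system with strongly varying permeability (hypothetical values).
k = np.array([500.0, 120.0, 30.0, 5.0, 1.0])    # mD
phi = np.array([0.22, 0.20, 0.18, 0.15, 0.12])
h = np.ones(5)                                   # equal layer thickness
print(round(lorenz_coefficient(k, phi, h), 3))
```

For a homogeneous system the flow-capacity curve coincides with the diagonal and the coefficient is zero, which is the straight-line case referred to in the abstract.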

Keywords: Carbonate reservoirs, heterogeneity, homogeneous system, Dykstra-Parsons permeability variations (V), Lorenz coefficient (L).

1113 Waste Management, Strategies and Situation in South Africa: An Overview

Authors: Edison Muzenda, Freeman Ntuli, Tsietsi Jefrey Pilusa

Abstract:

This paper highlights some interesting facts about South Africa's waste situation and management strategies, in particular Integrated Waste Management. South Africa supports a waste hierarchy by promoting cleaner production, waste minimisation, reuse, recycling and waste treatment, with disposal and remediation as the last preferred options in waste management. The drivers for waste management techniques are identified as increased demand for waste service provision; increased demand for waste minimisation, recycling and recovery; land use, physical and environmental limitations; and socio-economic and demographic factors. The South African government recognizes the importance of scientific research, as outlined in the White Paper on Integrated Pollution and Waste Management (IP&WM) (DEAT, 2000).

Keywords: Cleaner production, demographic factors, environmental quality, integrated waste management, hierarchy, recycling

1112 Weight Functions for Signal Reconstruction Based On Level Crossings

Authors: Nagesha, G. Hemantha Kumar

Abstract:

Although the level crossing concept has been the subject of intensive investigation over the last few years, certain problems of great interest remain unsolved. One of these concerns the distribution of threshold levels. This paper presents new threshold level allocation schemes for level crossing based on nonuniform sampling. Intuitively, it is more reasonable if the information-rich regions of the signal are sampled finer and those with sparse information are sampled coarser. To achieve this objective, we propose non-linear quantization functions which dynamically assign the number of quantization levels depending on the importance of the given amplitude range. Two new approaches to determine the importance of a given amplitude segment are presented, based on exponential and logarithmic functions. Various aspects of the proposed techniques are discussed and experimentally validated, and their efficacy is investigated by comparison with uniform sampling.
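
A minimal sketch of the idea is given below: threshold levels are allocated nonuniformly with a logarithmic (mu-law style) mapping so that they are denser near zero, and samples are taken wherever the signal crosses a level. The mapping and test signal are illustrative stand-ins for the paper's exponential and logarithmic importance functions:

```python
import numpy as np

def log_levels(n_levels, max_amp, mu=8.0):
    """Nonuniform, symmetric threshold levels that are denser near zero
    (inverse mu-law mapping of uniformly spaced companded values)."""
    u = np.linspace(1.0 / (n_levels // 2), 1.0, n_levels // 2)
    pos = max_amp * np.expm1(u * np.log1p(mu)) / mu
    return np.sort(np.concatenate([-pos, pos]))

def level_crossings(x, t, levels):
    """Return (time, level) samples wherever the signal crosses a threshold level."""
    samples = []
    for lv in levels:
        above = x >= lv
        idx = np.flatnonzero(above[1:] != above[:-1])     # sign change -> crossing
        for i in idx:
            # linear interpolation of the crossing instant between t[i] and t[i+1]
            frac = (lv - x[i]) / (x[i + 1] - x[i])
            samples.append((t[i] + frac * (t[i + 1] - t[i]), lv))
    return sorted(samples)

t = np.linspace(0, 1, 4000)
x = 0.8 * np.sin(2 * np.pi * 3 * t) * np.exp(-2 * t)      # decaying test signal
levels = log_levels(16, max_amp=1.0)
print(len(level_crossings(x, t, levels)), "level-crossing samples")
```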

Keywords: speech signals, sampling, signal reconstruction, asynchronous delta modulation, non-linear quantization.

1111 Parameters Influencing the Output Precision of a Lens-Lens Beam Generator Solar Concentrator

Authors: M. Tawfik, X. Tonnellier, C. Sansom

Abstract:

The Lens-Lens Beam Generator (LLBG) is a Fresnel-based optical concentrating technique which provides flexibility in selecting the solar receiver location compared to conventional techniques, by generating a powerful concentrated collimated solar beam. To achieve this, two successive lenses are used, followed by a flat mirror. Since the beam emerging from the LLBG has a high power flux impinging on the target receiver, it is important to determine the precision of the system output. In the present work, a mathematical investigation of the different parameters affecting the precision of the output beam is carried out. These parameters include: deflection of the sun-facing lens and its holding arm, delay in updating the solar tracking system, and the flatness of the flat mirror surface. Moreover, relationships that describe the power lost due to the effect of each parameter are derived in this study.
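
As a rough numerical illustration of one of these parameters (a sketch based only on generic geometry, not the paper's derivations), the pointing error accumulated between tracker updates can be estimated from the sun's apparent rate of about 0.25 deg/min, and a flat-mirror tilt deviates the reflected beam by twice the tilt angle:

```python
import numpy as np

SUN_RATE_DEG_PER_MIN = 360.0 / (24 * 60)    # apparent solar motion, ~0.25 deg/min

def tracking_delay_error(delay_min):
    """Angular pointing error (degrees) accumulated before the tracker updates."""
    return SUN_RATE_DEG_PER_MIN * delay_min

def beam_offset(angle_error_deg, distance_m, reflected=True):
    """Lateral displacement of the beam at the receiver for a given angular error.
    A tilt of a flat mirror deviates the reflected beam by twice the tilt angle."""
    dev = np.radians(angle_error_deg) * (2.0 if reflected else 1.0)
    return distance_m * np.tan(dev)

for delay in (0.5, 1.0, 2.0):               # hypothetical tracker update delays (min)
    err = tracking_delay_error(delay)
    print(f"delay {delay:4.1f} min -> error {err:.3f} deg, "
          f"offset at 10 m: {beam_offset(err, 10.0) * 100:.1f} cm")
```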

Keywords: Fresnel lens, LLBG, solar concentrator, solar tracking.

1110 Payment Problems, Cash Flow and Profitability of Construction Project: A System Dynamics Model

Authors: Wenhua Hou, Xing Liu, Deqiang Chen

Abstract:

The ubiquitous payment problems within the construction industry of China are notoriously hard to resolve and lead to a series of impacts along the industry chain. Among them, the most direct result is a negative effect on the normal operation of contractors. A wealth of research has already discussed the reasons for the payment problems and introduced a number of possible improvement strategies, but the identified causalities still fall short of the harsh reality. In this paper, the authors propose a model of the cash flow system of construction projects, using System Dynamics techniques to explore the causal facets of the payment problem. The effects of payment arrears on both the cash flow and the profitability of a project are simulated in four scenarios using data from real projects. The simulation results show visible clues that help contractors quantitatively determine the consequences of payment delay for a construction project.
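
A minimal discrete-time stock-and-flow sketch of the mechanism, not the authors' full System Dynamics model, is shown below: work generates monthly outflows immediately, while progress payments arrive after a delay, and negative cash triggers a hypothetical financing cost that erodes profit:

```python
import numpy as np

def simulate_cash_flow(months=24, monthly_cost=100.0, progress_payment=120.0,
                       payment_delay=3, initial_cash=200.0):
    """Stock-and-flow view of contractor cash: outflows occur when work is done,
    inflows (progress payments) arrive 'payment_delay' months later."""
    cash = np.zeros(months + 1)
    cash[0] = initial_cash
    financing_cost = 0.0
    for m in range(1, months + 1):
        inflow = progress_payment if m > payment_delay else 0.0
        cash[m] = cash[m - 1] + inflow - monthly_cost
        if cash[m] < 0:                     # bridge loan needed; 1% monthly interest
            financing_cost += 0.01 * (-cash[m])
    profit = cash[months] - initial_cash - financing_cost
    return cash, profit

for delay in (1, 3, 6):                     # hypothetical payment-delay scenarios
    _, profit = simulate_cash_flow(payment_delay=delay)
    print(f"payment delay {delay} months -> cumulative profit {profit:.1f}")
```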

Keywords: payment problems, cash flow, profitability, system dynamics.

1109 Enhanced Approaches to Rectify the Noise, Illumination and Shadow Artifacts

Authors: M. Sankari, C. Meena

Abstract:

Enhancing the quality of two-dimensional signals is one of the most important factors in the fields of video surveillance and computer vision. In real-life video surveillance, false detections usually occur due to the presence of random noise, illumination variation and shadow artifacts, and detection methods based on background subtraction face several problems in accurately detecting objects in realistic environments. In this paper, we propose a noise removal algorithm using a neighborhood comparison method with thresholding. Illumination variations in the detected foreground objects are corrected by using an amalgamation of techniques including homomorphic decomposition, curvelet transformation and a gamma adjustment operator. Shadow is removed using a chromaticity estimator with a local relation estimator. Results are compared with existing methods and demonstrate high robustness in video surveillance.
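
A minimal sketch of two of these steps is given below (neighborhood-comparison denoising with a threshold, followed by a gamma adjustment operator); the threshold and gamma values are illustrative, and the homomorphic/curvelet and shadow-removal stages are not reproduced:

```python
import numpy as np

def neighborhood_denoise(img, threshold=30):
    """Replace a pixel by the median of its 3x3 neighborhood when it deviates
    from that median by more than 'threshold' grey levels; otherwise keep it."""
    img = img.astype(np.float32)
    padded = np.pad(img, 1, mode='edge')
    out = img.copy()
    H, W = img.shape
    for i in range(H):
        for j in range(W):
            window = padded[i:i + 3, j:j + 3]
            med = np.median(window)
            if abs(img[i, j] - med) > threshold:
                out[i, j] = med
    return out.astype(np.uint8)

def gamma_adjust(img, gamma=0.7):
    """Gamma adjustment operator for illumination correction (img in [0, 255])."""
    return np.uint8(255.0 * (img / 255.0) ** gamma)

rng = np.random.default_rng(3)
frame = np.clip(rng.normal(120, 10, (64, 64)), 0, 255).astype(np.uint8)
frame[rng.random(frame.shape) < 0.02] = 255           # sprinkle impulse noise
cleaned = gamma_adjust(neighborhood_denoise(frame))
print(frame.mean(), cleaned.mean())
```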

Keywords: Chromaticity Estimator, Curvelet Transformation, Denoising, Gamma correction, Homomorphic, Neighborhood Assessment.

1108 Work Structuring and the Feasibility of Application to Construction Projects in Vietnam

Authors: Viet-Hung Nguyen, Luh-Maan Chang

Abstract:

Design should be viewed concurrently in three ways: as transformation, flow and value generation. An innovative approach to solving design-related problems is the integrated product-process design. As a foundation for a formal framework consisting of organizing principles and techniques, Work Structuring has been developed to guide integration efforts that enhance the development of operation and process design in alignment with product design. Vietnamese construction projects are facing many delays and cost overruns caused mostly by design-related problems. Better design management that integrates product and process design could resolve these problems. A questionnaire survey and in-depth interviews were used to investigate the feasibility of applying Work Structuring to construction projects in Vietnam. The purpose of this paper is to present the research results and to illustrate the possible problems and potential solutions when Work Structuring is implemented in construction projects in Vietnam.

Keywords: integrated product-process design, Work Structuring, construction projects, Vietnam

1107 Achieving Fair Share Objectives via Goal-Oriented Parallel Computer Job Scheduling Policies

Authors: Sangsuree Vasupongayya

Abstract:

Fair share is one of the scheduling objectives supported on many production systems. However, fair share has been shown to cause performance problems for some users, especially users with difficult jobs. This work focuses on extending goal-oriented parallel computer job scheduling policies to cover the fair share objective. Goal-oriented parallel computer job scheduling policies have been shown to achieve good scheduling performance when conflicting objectives are required; they do so by using anytime combinatorial search techniques to find a good compromise schedule within a time limit. The experimental results show that the proposed goal-oriented parallel computer job scheduling policy, namely Tradeoffs(Tw:avgX), achieves good scheduling performance and also provides good fair share performance.

Keywords: goal-oriented parallel job scheduling policies, fair share.

1106 The Origin, Diffusion and a Comparison of Ordinary Differential Equations Numerical Solutions Used by SIR Model in Order to Predict SARS-CoV-2 in Nordic Countries

Authors: Gleda Kutrolli, Maksi Kutrolli, Etjon Meco

Abstract:

The SARS-CoV-2 virus is currently one of the most infectious pathogens for humans. It emerged in China at the end of 2019 and has now spread all over the world. The origin and diffusion of the SARS-CoV-2 epidemic are analysed based on a discussion of viral phylogeny theory. With the aim of understanding the spread of infection in the affected countries, it is crucial to model the spread of the virus and simulate its activity. In this paper, the prediction of the coronavirus outbreak is done by using the SIR model without vital dynamics, applying different numerical techniques for solving the ordinary differential equations (ODEs). We find that the ABM and MRT methods perform better than the other techniques, and that the activity of the virus will decrease in April but never cease (for some time the activity will remain low), with the next cycle starting in mid-July 2020 for Norway and Denmark, in October 2020 for Sweden, and in September for Finland.
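
For reference, the sketch below integrates the classic SIR model without vital dynamics using a fixed-step fourth-order Runge-Kutta scheme; the parameters are placeholders, not the values fitted for the Nordic countries:

```python
import numpy as np

def sir_rhs(y, beta, gamma):
    """SIR model without vital dynamics (S, I, R as population fractions):
       dS/dt = -beta*S*I,  dI/dt = beta*S*I - gamma*I,  dR/dt = gamma*I."""
    S, I, R = y
    return np.array([-beta * S * I, beta * S * I - gamma * I, gamma * I])

def rk4(y0, beta, gamma, days, dt=0.1):
    """Fixed-step fourth-order Runge-Kutta integration of the SIR system."""
    steps = int(days / dt)
    y = np.zeros((steps + 1, 3))
    y[0] = y0
    for n in range(steps):
        k1 = sir_rhs(y[n], beta, gamma)
        k2 = sir_rhs(y[n] + 0.5 * dt * k1, beta, gamma)
        k3 = sir_rhs(y[n] + 0.5 * dt * k2, beta, gamma)
        k4 = sir_rhs(y[n] + dt * k3, beta, gamma)
        y[n + 1] = y[n] + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)
    return y

# Placeholder parameters: R0 = beta/gamma = 2.5, mean infectious period of 10 days.
traj = rk4(y0=np.array([0.999, 0.001, 0.0]), beta=0.25, gamma=0.1, days=180)
print("peak infected fraction:", traj[:, 1].max())
```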

Keywords: Forecasting, ordinary differential equations, SARS-CoV-2 epidemic, SIR model.

1105 A Stochastic Diffusion Process Based on the Two-Parameters Weibull Density Function

Authors: Meriem Bahij, Ahmed Nafidi, Boujemâa Achchab, Sílvio M. A. Gama, José A. O. Matos

Abstract:

Stochastic modeling concerns the use of probability to model real-world situations in which uncertainty is present. The purpose of stochastic modeling is therefore to estimate the probability of outcomes within a forecast, i.e. to be able to predict what conditions or decisions might occur under different situations. In the present study, we present a model of a stochastic diffusion process based on the bi-Weibull distribution function (its trend is proportional to the bi-Weibull probability density function). In general, the Weibull distribution has the ability to assume the characteristics of many different types of distributions. This has made it very popular among engineers and quality practitioners, who consider it the most commonly used distribution for studying problems such as modeling reliability data, accelerated life testing, and maintainability modeling and analysis. In this work, we start by obtaining the probabilistic characteristics of this model, such as the explicit expression of the process, its trend, and its distribution, by transforming the diffusion process into a Wiener process as shown in the Ricciardi theorem. Then, we develop the statistical inference of this model using the maximum likelihood methodology. Finally, we analyse, with simulated data, the computational problems associated with the parameters, an issue of great importance in its application to real data, with the use of convergence analysis methods. Overall, the use of a stochastic model reflects only a pragmatic decision on the part of the modeler. According to the data that are available and the universe of models known to the modeler, this model represents the best currently available description of the phenomenon under consideration.
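
For reference, the two-parameter Weibull probability density function, to which the abstract says the trend of the process is proportional, has the standard form (scale alpha > 0, shape beta > 0); the paper's specific drift specification is not reproduced here:

```latex
f(t;\alpha,\beta) \;=\; \frac{\beta}{\alpha}\left(\frac{t}{\alpha}\right)^{\beta-1}
  \exp\!\left[-\left(\frac{t}{\alpha}\right)^{\beta}\right], \qquad t \ge 0 .
```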

Keywords: Diffusion process, discrete sampling, likelihood estimation method, simulation, stochastic diffusion equation, trends functions, bi-parameters Weibull density function.

1104 Enhanced Performance of Fading Dispersive Channel Using Dynamic Frequency Hopping (DFH)

Authors: Walid M. Saad

Abstract:

Techniques are examined to overcome the performance degradation caused by channel dispersion, using slow frequency hopping (SFH) with dynamic frequency hopping (DFH) pattern adaptation. In DFH systems, the frequency slots are selected by continuous quality monitoring of all frequencies available in the system and by modification of the hopping pattern of each individual link, replacing slots whose signal-to-interference ratio (SIR) measurement is below a required threshold. Simulation results show the improvements in BER obtained by DFH in comparison with matched frequency hopping (MFH), random frequency hopping (RFH) and multi-carrier code division multiple access (MC-CDMA) in multipath, slowly fading dispersive channels using a generalized bandpass two-path transfer function model, and show the improvement obtained according to the threshold selection.
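
A minimal sketch of the pattern-adaptation idea is given below (not the paper's simulation setup): slots whose monitored SIR falls below the threshold are swapped for the best unused frequencies:

```python
import numpy as np

def adapt_hopping_pattern(pattern, sir_db, threshold_db, candidate_sir_db):
    """Replace hop slots whose measured SIR falls below 'threshold_db' with the
    best-quality frequencies not already used in the pattern.

    pattern          : list of frequency indices currently in the hop sequence
    sir_db           : measured SIR of each slot in 'pattern'
    candidate_sir_db : dict {frequency index: monitored SIR} for all frequencies
    """
    used = set(pattern)
    spare = sorted((f for f in candidate_sir_db if f not in used),
                   key=lambda f: candidate_sir_db[f], reverse=True)
    new_pattern = list(pattern)
    for i, (f, sir) in enumerate(zip(pattern, sir_db)):
        if sir < threshold_db and spare:
            new_pattern[i] = spare.pop(0)       # swap in the best unused frequency
    return new_pattern

rng = np.random.default_rng(4)
all_freqs = {f: rng.uniform(0, 30) for f in range(20)}    # monitored SIR per frequency (dB)
pattern = [0, 5, 9, 13, 17]
sir = [all_freqs[f] + rng.normal(0, 2) for f in pattern]  # per-link slot measurements
print(adapt_hopping_pattern(pattern, sir, threshold_db=12.0, candidate_sir_db=all_freqs))
```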

Keywords: code division multiple access (CDMA), dynamic channel allocation (DCA), dynamic channel assignment, frequency hopping, matched frequency hopping (MFH).

1103 Mean Shift-based Preprocessing Methodology for Improved 3D Buildings Reconstruction

Authors: Nikolaos Vassilas, Theocharis Tsenoglou, Djamchid Ghazanfarpour

Abstract:

In this work, we explore the capability of the mean shift algorithm as a powerful preprocessing tool for improving the quality of spatial data acquired from airborne scanners over densely built urban areas. On the one hand, high-resolution image data corrupted by noise from lossy compression techniques are appropriately smoothed while preserving the optical edges; on the other, low-resolution LiDAR data in the form of a normalized Digital Surface Map (nDSM) are upsampled through the joint mean shift algorithm. Experiments on both the edge-preserving smoothing and the upsampling capabilities, using synthetic RGB-z data, show that the mean shift algorithm is superior to bilateral filtering as well as to other classical smoothing and upsampling algorithms. Application of the proposed methodology to the 3D reconstruction of buildings in a pilot region of Athens, Greece results in a significant visual improvement of the 3D building block model.
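
As a minimal illustration of the edge-preserving smoothing step, the sketch below applies OpenCV's built-in colour mean-shift filter to a synthetic noisy image; it stands in for, and is not, the paper's joint RGB-z formulation:

```python
import cv2
import numpy as np

def mean_shift_smooth(img_bgr, spatial_radius=10, color_radius=20):
    """Edge-preserving mean-shift smoothing of an 8-bit BGR image.
    Pixels are clustered jointly in the spatial and colour domains, which
    smooths noise (e.g. compression artefacts) while keeping sharp edges."""
    return cv2.pyrMeanShiftFiltering(img_bgr, spatial_radius, color_radius)

# Synthetic test: a two-region image corrupted by noise.
img = np.zeros((128, 128, 3), np.uint8)
img[:, 64:] = (200, 180, 160)
noise = np.random.default_rng(5).integers(-25, 26, img.shape)
noisy = np.clip(img.astype(np.int16) + noise, 0, 255).astype(np.uint8)
smoothed = mean_shift_smooth(noisy)
print(noisy.std(), smoothed.std())
```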

Keywords: 3D buildings reconstruction, data fusion, data upsampling, mean shift.

1102 Discriminant Analysis as a Function of Predictive Learning to Select Evolutionary Algorithms in Intelligent Transportation System

Authors: Jorge A. Ruiz-Vanoye, Ocotlán Díaz-Parra, Alejandro Fuentes-Penna, Daniel Vélez-Díaz, Edith Olaco García

Abstract:

In this paper, we present the use of discriminant analysis to select the evolutionary algorithms that best solve instances of the vehicle routing problem with time windows. We use instance indicators as independent variables to obtain the classification criteria, and the best algorithm among the generic genetic algorithm (GA), random search (RS), steady-state genetic algorithm (SSGA), and sexual genetic algorithm (SXGA) as the dependent variable for the classification. The discriminant classifier was trained with classic instances of the vehicle routing problem with time windows obtained from the Solomon benchmark. The discriminant analysis achieved a classification accuracy of 66.7%.
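
A minimal sketch of the classification step is shown below with scikit-learn's linear discriminant analysis; the instance indicators and best-algorithm labels are synthetic stand-ins for the Solomon-benchmark data:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

# Synthetic stand-in data: each row holds indicator values computed from a VRPTW
# instance (e.g. customer count, time-window tightness, capacity ratio), and the
# label records which algorithm performed best on that instance.
rng = np.random.default_rng(6)
X = rng.normal(size=(120, 4))
best_algorithm = rng.choice(["GA", "RS", "SSGA", "SXGA"], size=120)

lda = LinearDiscriminantAnalysis()
scores = cross_val_score(lda, X, best_algorithm, cv=5)
print("cross-validated classification accuracy:", scores.mean())
```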

Keywords: Intelligent transportation systems, data-mining techniques, evolutionary algorithms, discriminant analysis, machine learning.

1101 Low-Cost Pre-Treatment of Pharmaceutical Wastewater

Authors: A. Abu-Safa, S. Abu-Salah, M. Mosa, S. Gharaibeh

Abstract:

Pharmaceutical industries and the effluents of sewage treatment plants are the main sources of residual pharmaceuticals in water resources. These emergent pollutants may adversely impact the biophysical environment. Pharmaceutical industries often generate wastewater whose characteristics and quantity change depending on the manufacturing processes used. Carbamazepine (CBZ), 5H-dibenzo[b,f]azepine-5-carboxamide (C15H12N2O), is a significant non-biodegradable pharmaceutical contaminant in Jordanian pharmaceutical wastewater, which is not removed by the activated sludge processes in treatment plants. Activated carbon may potentially remove this pollutant from effluents, but the high cost involved suggests that more attention should be given to the potential use of low-cost materials in order to reduce cost and environmental contamination. Powders of Jordanian non-metallic raw materials, namely Azraq Bentonite (AB), Kaolinite (K) and Zeolite (Zeo), were activated (acid and thermal treatment) and evaluated by removing CBZ. The results of batch and column experiments showed around 46% and 67% removal of CBZ, respectively.

Keywords: Azraq bentonite, carbamazepine, pharmaceutical wastewater, zeolite.

1100 Design Optimization Methodology of CMOS Active Mixers for Multi-Standard Receivers

Authors: S. Douss, F. Touati, M. Loulou

Abstract:

A design flow of multi-standard down-conversion CMOS mixers for three modern standards, Global System for Mobile communications, Digital Enhanced Cordless Telephone and Universal Mobile Telecommunications System, is presented. Three active mixer structures are studied. The first is based on the Gilbert cell, which gives a tolerable noise figure and linearity with a low conversion gain. The second and third structures use the current bleeding and charge injection techniques in order to increase the conversion gain. An improvement of about 2 dB in conversion gain is achieved without considerable degradation of the other characteristics. The models used for noise figure, conversion gain and IIP3 are studied. This study describes the nature of the trade-offs inherent in such structures and gives insights that help in identifying which structure is better for given conditions.

Keywords: Active mixer, Radio-frequency transceiver, Multi-standard front end, Gilbert cell, current bleeding, charge injection.

1099 Analysis of Electrocardiograph (ECG) Signal for the Detection of Abnormalities Using MATLAB

Authors: Durgesh Kumar Ojha, Monica Subashini

Abstract:

The proposed method is to study and analyze the electrocardiograph (ECG) waveform to detect abnormalities with reference to the P, Q, R and S peaks. The first phase includes the acquisition of real-time ECG data. The next phase involves signal generation followed by pre-processing. Thirdly, the procured ECG signal is subjected to feature extraction. The extracted features detect abnormal peaks present in the waveform, so that normal and abnormal ECG signals can be differentiated based on the extracted features. The work is implemented in the familiar multipurpose tool MATLAB. This software efficiently uses algorithms and techniques for the detection of any abnormalities present in the ECG signal. Proper utilization of MATLAB functions (both built-in and user-defined) allows working with ECG signals for processing and analysis in real-time applications. The simulation would help in improving the accuracy, and the hardware could be built conveniently.
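
The paper's implementation is in MATLAB; the sketch below is a rough Python analogue of the peak-detection and interval-checking idea, using a synthetic ECG-like signal and illustrative thresholds rather than the authors' feature set:

```python
import numpy as np
from scipy.signal import find_peaks

def detect_r_peaks(ecg, fs):
    """Detect R peaks as prominent maxima at least 0.3 s apart (illustrative)."""
    peaks, _ = find_peaks(ecg, prominence=0.5, distance=int(0.3 * fs))
    return peaks

def rr_abnormalities(r_peaks, fs, low_bpm=50, high_bpm=110):
    """Flag beats whose instantaneous heart rate falls outside a normal range."""
    rr = np.diff(r_peaks) / fs                  # R-R intervals in seconds
    bpm = 60.0 / rr
    return [(i, round(b, 1)) for i, b in enumerate(bpm) if b < low_bpm or b > high_bpm]

fs = 250
t = np.arange(0, 10, 1 / fs)
beat_times = np.cumsum(np.r_[0.5, 0.8 + 0.05 * np.random.default_rng(7).normal(size=11)])
ecg = sum(np.exp(-((t - bt) ** 2) / (2 * 0.01 ** 2)) for bt in beat_times)   # R waves
ecg += 0.05 * np.random.default_rng(8).normal(size=t.size)                   # noise

r = detect_r_peaks(ecg, fs)
print(len(r), "R peaks;", rr_abnormalities(r, fs))
```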

Keywords: ECG Waveform, Peak Detection, Arrhythmia, Matlab.

1098 Study and Analysis of Permeable Articulated Concrete Blocks Pavement: With Reference to Indian Context

Authors: Shrikant Charhate, Gayatri Deshpande

Abstract:

Permeable pavements have significant benefits over conventional pavements in terms of sustainability and environmental impact, such as managing runoff and infiltration while carrying traffic. Some countries are using this technique, especially at locations where durability and other parameters are important; however, sparse work has been done on this concept, and in India it is yet to be adopted. In this work, the progress in the characterization and development of Permeable Articulated Concrete Block (PACB) pavement design is described and discussed with reference to Indian conditions. The experimentation and in-depth analysis were carried out considering conditions like soil erosion, water logging and dust, which are significant challenges caused by the impermeability of pavement. Concrete blocks of size 16.5" x 6.5" x 7", consisting of an arch shape (4") beneath and 1/2" PVC holes for articulation, were cast. These blocks were tested for flexural strength. The articulation was done with nylon ropes, forming a series of connected concrete blocks. The total spacing between the blocks was kept at about 8 to 10% of the total area. Hydraulic testing was carried out by placing the articulated blocks over layers of soil, geotextile and clean angular aggregate, in order to determine the percentage of seepage through the entire system. The experimental results showed that, with this block shape, the flexural strength achieved was beyond the permissible limit. Such blocks in this combination could be a very useful innovation in Indian conditions and useful at various locations as an alternative to traditional blocks for long-term sustainability.

Keywords: Connections, geotextile, permeable ACB, pavements, stone base.

1097 Design and Analysis of a Piezoelectric Linear Motor Based on Rigid Clamping

Authors: Chao Yi, Cunyue Lu, Lingwei Quan

Abstract:

Piezoelectric linear motors have the characteristics of good electromagnetic compatibility, high positioning accuracy, compact structure and no deceleration mechanism, which make them promising for application in micro-miniature precision drive systems. However, most piezoelectric motors employ flexible clamping, which has insufficient rigidity and is difficult to use in rapid positioning. Another problem is that this clamping method seriously affects the vibration efficiency of the vibrating unit. In order to solve these problems, this paper proposes a piezoelectric stack linear motor based on double-end rigid clamping. First, a piezoelectric linear motor with a length of only 35.5 mm is designed. This motor is mainly composed of a motor stator, a driving foot, a ceramic friction strip, a linear guide, a pre-tightening mechanism and a base. This structure is much simpler and smaller than most similar motors, and it is easy to assemble as well as to realize precise control. In addition, the properties of the piezoelectric stack are reviewed and, in order to obtain an elliptical motion trajectory of the driving head, a driving scheme based on a longitudinal-shear composite stack is proposed. Finally, impedance analysis and speed performance testing were performed on the piezoelectric linear motor prototype. The motor reaches speeds of up to 25.5 mm/s under the excitation of a signal voltage of 120 V at a frequency of 390 Hz. The results show that the proposed piezoelectric stack linear motor achieves good performance: it runs smoothly over a large speed range, which makes it suitable for precision control in medical imaging, aerospace, precision machinery and many other fields.
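
For reference, the elliptical trajectory of the driving foot obtained by superposing the longitudinal and shear stack vibrations can be written in the standard parametric form (a generic relation, not the prototype's measured amplitudes):

```latex
x(t) = A_{\mathrm{long}}\sin(\omega t), \qquad
y(t) = A_{\mathrm{shear}}\sin(\omega t + \varphi),
```

which traces an ellipse whenever the phase difference varphi is neither 0 nor pi, so the foot alternately pushes the slider and lifts clear of it.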

Keywords: Elliptical trajectory, linear motor, piezoelectric stack, rigid clamping.

1096 Evaluation Method for Information Security Levels of CIIP (Critical Information Infrastructure Protection)

Authors: Soon-Tai Park, Jong-Whoi Shin, Bog-Ki Min, Ik-Sub Lee, Gang-Shin Lee, Jae-Il Lee

Abstract:

As the information age matures, major social infrastructures such as communication, finance, military and energy have become ever more dependent on information communication systems. Since these infrastructures are connected to the Internet, electronic intrusions such as hacking and viruses have become a new security threat. In particular, disturbance or neutralization of a major social infrastructure can result in extensive material damage and social disorder. To address this issue, many nations around the world are researching and developing various techniques and information security policies as a government-wide effort to protect their infrastructures from newly emerging threats. This paper proposes an evaluation method for the information security levels of CIIP (Critical Information Infrastructure Protection), which can enhance the security level of critical information infrastructure by checking the current security status and establishing security measures accordingly to protect infrastructures effectively.

Keywords: Information Security Evaluation Methodology, Critical Information Infrastructure Protection.
