Search results for: algorithm techniques
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 9588

5988 Remote Sensing Application in Environmental Researches: Case Study of Iran Mangrove Forests Quantitative Assessment

Authors: Neda Orak, Mostafa Zarei

Abstract:

Environmental assessment is an important part of environmental management, and various methods and techniques have been developed and implemented for it. Remote sensing (RS) is widely used in many scientific and research fields such as geology, cartography, geography, agriculture, forestry, land use planning, and environmental studies. It can reveal cyclical changes in earth-surface objects and delineate the limits of earth phenomena on the basis of recorded changes and deviations in electromagnetic reflectance. This research applied RS techniques to the assessment of mangrove forests; its aim was a quantitative analysis of the mangrove forests in the Basatin and Bidkhoon estuaries, carried out with Landsat satellite images from 1975 to 2013 matched to ground control points. These mangroves are the last distribution in the northern hemisphere, so the work can provide a good basis for improved management of this important ecosystem. Landsat has provided researchers with valuable images for detecting changes on the earth's surface; this research used the MSS, TM, ETM+, and OLI sensors from 1975, 1990, 2000, and 2003-2013. After essential corrections such as error fixing, band combination, and georeferencing to the 2012 image as the base image, changes were studied using maximum likelihood supervised classification and the IPVI index. A 2004 Google Earth image and ground points collected by GPS (2010-2012) were used to verify the changes obtained from the satellite images. Results showed that in 2012 the mangrove area in Bidkhoon was 1119072 m2 by GPS, 1231200 m2 by maximum likelihood supervised classification, and 1317600 m2 by IPVI; the corresponding areas in Basatin were 466644 m2, 88200 m2, and 63000 m2. The final results show that the forests have declined naturally; in Basatin, the decline is due to human activities. The loss was offset by planting over many years, although the trend has again been declining in recent years. Overall, satellite images have a high ability to estimate such environmental processes, and this research showed a high correlation between image-derived indexes such as IPVI and NDVI and the ground control points.
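
The IPVI and NDVI indexes mentioned above are simple band ratios of red and near-infrared reflectance. As a minimal sketch (not the authors' workflow), the following snippet computes both indexes from two co-registered Landsat band arrays and estimates a mangrove area from a threshold on IPVI; the threshold and the random test arrays are illustrative assumptions.

```python
# Minimal sketch: NDVI and IPVI from co-registered red and near-infrared
# reflectance arrays (e.g. Landsat bands already loaded as NumPy arrays).
import numpy as np

def vegetation_indexes(red, nir, eps=1e-6):
    """Return (NDVI, IPVI) for reflectance arrays of identical shape."""
    red = red.astype(float)
    nir = nir.astype(float)
    ndvi = (nir - red) / (nir + red + eps)   # range [-1, 1]
    ipvi = nir / (nir + red + eps)           # range [0, 1]; IPVI = (NDVI + 1) / 2
    return ndvi, ipvi

# Toy example: flag pixels as mangrove where IPVI exceeds a threshold
red = np.random.rand(100, 100)
nir = np.random.rand(100, 100)
ndvi, ipvi = vegetation_indexes(red, nir)
mangrove_mask = ipvi > 0.6                   # threshold is an assumption
pixel_area_m2 = 30 * 30                      # Landsat TM/ETM+/OLI pixel size
print("estimated area (m2):", mangrove_mask.sum() * pixel_area_m2)
```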

Keywords: IPVI index, Landsat sensor, maximum likelihood supervised classification, Nayband National Park

Procedia PDF Downloads 276
5987 Density-based Denoising of Point Cloud

Authors: Faisal Zaman, Ya Ping Wong, Boon Yian Ng

Abstract:

Point cloud source data for surface reconstruction is usually contaminated with noise and outliers. To overcome this, we present a novel approach that uses a modified kernel density estimation (KDE) technique with bilateral filtering to remove noisy points and outliers. First, we present a method for estimating the optimal bandwidth of multivariate KDE using a particle swarm optimization technique, which ensures robust performance of the density estimation. We then use the mean-shift algorithm to find the local maxima of the density estimate, which give the centroids of the clusters, and compute the distance of each point from its centroid. Points belonging to outliers are then removed by an automatic thresholding scheme, which yields an accurate and economical point set surface. The experimental results show that our approach is comparably robust and efficient.
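
As a rough illustration of the pipeline (density estimate, cluster centroids via mean shift, distance-based thresholding), the sketch below uses scikit-learn with a fixed bandwidth; the paper's PSO-optimised bandwidth and bilateral filtering are not reproduced, and the synthetic data and threshold rule are assumptions.

```python
# Sketch of the overall idea: KDE for density, mean shift for centroids,
# and an automatic threshold on distance/density to drop outliers.
import numpy as np
from sklearn.neighbors import KernelDensity
from sklearn.cluster import MeanShift

rng = np.random.default_rng(0)
surface = rng.normal(size=(500, 3)) * np.array([1.0, 1.0, 0.05])  # noisy planar patch
outliers = rng.uniform(-4, 4, size=(30, 3))                       # scattered outliers
points = np.vstack([surface, outliers])

bandwidth = 0.5                                   # fixed here; the paper tunes it with PSO
kde = KernelDensity(kernel="gaussian", bandwidth=bandwidth).fit(points)
log_density = kde.score_samples(points)

ms = MeanShift(bandwidth=bandwidth).fit(points)   # local maxima of the density -> centroids
dist = np.linalg.norm(points - ms.cluster_centers_[ms.labels_], axis=1)

# Automatic threshold (an assumption): drop points far from their centroid
# or sitting in very low-density regions.
keep = ((dist < dist.mean() + 2 * dist.std())
        & (log_density > log_density.mean() - 2 * log_density.std()))
print("kept", keep.sum(), "of", len(points), "points")
```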

Keywords: point preprocessing, outlier removal, surface reconstruction, kernel density estimation

Procedia PDF Downloads 333
5986 A Multilevel Approach for Stroke Prediction Combining Risk Factors and Retinal Images

Authors: Jeena R. S., Sukesh Kumar A.

Abstract:

Stroke is one of the major causes of adult disability and morbidity in many developing countries such as India. Early diagnosis of stroke is essential for timely prevention and cure. Various conventional statistical methods and computational intelligence models have been developed for predicting the risk and outcome of stroke. This research work focuses on a multilevel approach for predicting the occurrence of stroke based on various risk factors and non-invasive techniques such as retinal imaging. This risk prediction model can aid clinical decision making and help patients obtain an improved and reliable risk prediction.

Keywords: prediction, retinal imaging, risk factors, stroke

Procedia PDF Downloads 284
5985 New Approaches for the Handwritten Digit Image Features Extraction for Recognition

Authors: U. Ravi Babu, Mohd Mastan

Abstract:

The present paper proposes a novel approach for a handwritten digit recognition system. It extracts digit image features based on a distance measure and derives an algorithm to classify the digit images. The distance measure is computed on the thinned image; thinning is one of the preprocessing techniques in image processing. The paper mainly concentrates on extracting features from the digit image for effective recognition of the numeral. To assess its effectiveness, the proposed method was tested on the MNIST database, CENPARMI, CEDAR, and newly collected data. It was applied to more than one lakh (100,000) digit images and gives good comparative recognition results, with a recognition rate of about 97.32%.
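
The abstract does not specify the exact distance measure, so the sketch below only illustrates the general pipeline: binarise and thin the digit image, build a simple distance-based feature (a radial-distance histogram from the skeleton centroid, a hypothetical choice), and classify by the nearest class mean.

```python
# Hedged sketch of thinning + distance-based features + classification.
import numpy as np
from skimage.morphology import skeletonize

def distance_features(img, bins=8):
    """Binarise, thin, and describe the skeleton by a histogram of pixel
    distances from the skeleton centroid."""
    binary = img > img.mean()
    skel = skeletonize(binary)
    ys, xs = np.nonzero(skel)
    if len(xs) == 0:
        return np.zeros(bins)
    cy, cx = ys.mean(), xs.mean()
    d = np.hypot(ys - cy, xs - cx)
    hist, _ = np.histogram(d, bins=bins, range=(0, img.shape[0]))
    return hist / max(hist.sum(), 1)

def nearest_mean_classify(feat, class_means):
    """Assign the label whose mean feature vector is closest (Euclidean)."""
    dists = {label: np.linalg.norm(feat - mu) for label, mu in class_means.items()}
    return min(dists, key=dists.get)
```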

Keywords: handwritten digit recognition, distance measure, MNIST database, image features

Procedia PDF Downloads 445
5984 Image Compression Using Block Power Method for SVD Decomposition

Authors: El Asnaoui Khalid, Chawki Youness, Aksasse Brahim, Ouanan Mohammed

Abstract:

In recent decades, the rapid growth in the development of and demand for multimedia products has strained device bandwidth and network storage memory. Consequently, the theory of data compression has become more significant for reducing data redundancy in order to save on data transfer and storage. In this context, this paper addresses the problem of lossless and near-lossless compression of images. The proposed method is based on the Block SVD Power Method, which overcomes the disadvantages of MATLAB's SVD function. The experimental results show that the proposed algorithm has better compression performance than existing compression algorithms that use MATLAB's SVD function. In addition, the proposed approach is simple and can provide different degrees of error resilience, which gives better image compression in a short execution time.
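
A generic rank-k compression via block power iteration, in the spirit of the method described (but not the authors' Block SVD Power Method implementation), can be sketched as follows.

```python
# Illustrative rank-k image compression via block power iteration for the
# dominant singular triplets; the test image is a random stand-in.
import numpy as np

def power_svd(A, k, iters=100, seed=0):
    """Approximate the top-k singular triplets of A with block power iteration."""
    rng = np.random.default_rng(seed)
    Q = rng.normal(size=(A.shape[1], k))
    for _ in range(iters):
        Q, _ = np.linalg.qr(A.T @ (A @ Q))   # iterate on A^T A, re-orthonormalise
    V = Q
    AV = A @ V
    s = np.linalg.norm(AV, axis=0)           # approximate singular values
    U = AV / s
    return U, s, V

def compress(image, k):
    U, s, V = power_svd(image.astype(float), k)
    return U @ np.diag(s) @ V.T              # rank-k reconstruction

img = np.random.rand(256, 256)               # stand-in for a grayscale image
approx = compress(img, k=20)
print("relative error:", np.linalg.norm(img - approx) / np.linalg.norm(img))
```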

Keywords: image compression, SVD, block SVD power method, lossless compression, near lossless

Procedia PDF Downloads 367
5983 Optimal Production and Maintenance Policy for a Partially Observable Production System with Stochastic Demand

Authors: Leila Jafari, Viliam Makis

Abstract:

In this paper, the joint optimization of the economic manufacturing quantity (EMQ), safety stock level, and condition-based maintenance (CBM) is presented for a partially observable, deteriorating system subject to random failure. The demand is stochastic and it is described by a Poisson process. The stochastic model is developed and the optimization problem is formulated in the semi-Markov decision process framework. A modification of the policy iteration algorithm is developed to find the optimal policy. A numerical example is presented to compare the optimal policy with the policy considering zero safety stock.
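
For readers unfamiliar with the underlying algorithm, the sketch below shows plain policy iteration on a small discrete MDP with arbitrary data; the paper's contribution is a modification of this scheme within a semi-Markov decision process framework for the joint EMQ/safety-stock/CBM problem, which the sketch does not attempt to reproduce.

```python
# Standard policy iteration on a small discrete MDP with arbitrary transition
# probabilities and rewards (illustration only).
import numpy as np

n_states, n_actions, gamma = 3, 2, 0.95
rng = np.random.default_rng(1)
P = rng.dirichlet(np.ones(n_states), size=(n_states, n_actions))  # P[s, a, s']
R = rng.normal(size=(n_states, n_actions))                        # expected rewards

policy = np.zeros(n_states, dtype=int)
for _ in range(100):
    # Policy evaluation: solve (I - gamma * P_pi) V = R_pi
    P_pi = P[np.arange(n_states), policy]
    R_pi = R[np.arange(n_states), policy]
    V = np.linalg.solve(np.eye(n_states) - gamma * P_pi, R_pi)
    # Policy improvement: greedy with respect to the current value function
    Q = R + gamma * (P @ V)                  # Q[s, a]
    new_policy = Q.argmax(axis=1)
    if np.array_equal(new_policy, policy):
        break
    policy = new_policy
print("optimal policy:", policy, "values:", np.round(V, 3))
```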

Keywords: condition-based maintenance, economic manufacturing quantity, safety stock, stochastic demand

Procedia PDF Downloads 452
5982 Induction Machine Bearing Failure Detection Using Advanced Signal Processing Methods

Authors: Abdelghani Chahmi

Abstract:

This article examines the detection and localization of faults in electrical systems, particularly those using asynchronous machines. First, the process of failure will be characterized, relevant symptoms will be defined and based on those processes and symptoms, a model of those malfunctions will be obtained. Second, the development of the diagnosis of the machine will be shown. As studies of malfunctions in electrical systems could only rely on a small amount of experimental data, it has been essential to provide ourselves with simulation tools which allowed us to characterize the faulty behavior. Fault detection uses signal processing techniques in known operating phases.
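
The abstract does not name the specific signal processing methods used, so the following sketch shows one commonly used technique, assumed here purely for illustration: an envelope spectrum (via the Hilbert transform) of a simulated vibration signal, which exposes the repetition frequency of bearing fault impulses. The signal parameters are invented.

```python
# Envelope spectrum of a synthetic bearing-fault vibration signal:
# fault impulses excite a high-frequency resonance; demodulating with the
# Hilbert envelope reveals the low-frequency repetition rate.
import numpy as np
from scipy.signal import hilbert

fs, T = 20_000, 1.0                      # sampling rate (Hz), duration (s)
t = np.arange(0, T, 1 / fs)
fault_freq = 87.0                        # assumed characteristic fault frequency

impulses = (np.sin(2 * np.pi * fault_freq * t) > 0.999).astype(float)
resonance = np.exp(-t * 800) * np.sin(2 * np.pi * 3000 * t)
signal = np.convolve(impulses, resonance[:200], mode="same") + 0.1 * np.random.randn(len(t))

envelope = np.abs(hilbert(signal))                  # demodulate the resonance
spectrum = np.abs(np.fft.rfft(envelope - envelope.mean()))
freqs = np.fft.rfftfreq(len(envelope), 1 / fs)
peak = freqs[spectrum[freqs < 500].argmax()]        # look below 500 Hz
print(f"dominant envelope frequency ~ {peak:.1f} Hz (expected ~{fault_freq} Hz)")
```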

Keywords: induction motor, modeling, bearing damage, airgap eccentricity, torque variation

Procedia PDF Downloads 124
5981 Preventive Impact of Regional Analgesia on Chronic Neuropathic Pain After General Surgery

Authors: Beloulou Mohamed Lamine, Fedili Benamar, Meliani Walid, Chaid Dalila, Lamara Abdelhak

Abstract:

Introduction: Post-surgical chronic pain (PSCP) is a pathological condition with a rather complex etiopathogenesis that extensively involves sensitization processes and neuronal damage. The neuropathic component of these pains is almost always present, with variable expression depending on the type of surgery. Objective: To assess the presumed beneficial effect of Regional Anesthesia-Analgesia Techniques (RAAT) on the development of post-surgical chronic neuropathic pain (PSCNP) in various surgical procedures. Patients and Methods: A comparative study involving 510 patients distributed across five surgical models (mastectomy, thoracotomy, hernioplasty, cholecystectomy, and major abdominal-pelvic surgery) and randomized into two groups: Group A (240) receiving conventional postoperative analgesia and Group B (270) receiving balanced analgesia, including the implementation of a Regional Anesthesia-Analgesia Technique (RAAT). These patients were longitudinally followed over a 6-month period, with postsurgical chronic neuropathic pain (PSCNP) defined by a Neuropathic Pain Score DN2≥ 3. Comparative measurements through univariate and multivariable analyses were performed to identify associations between the development of PSCNP and certain predictive factors, including the presumed preventive impact (protective effect) of RAAT. Results: At the 6th month post-surgery, 419 patients were analyzed (Group A= 196 and Group B= 223). The incidence of PSCNP was 32.2% (n=135). Among these patients with chronic pain, the prevalence of neuropathic pain was 37.8% (95% CI: [29.6; 46.5]), with n=51/135. It was significantly lower in Group B compared to Group A, with respective percentages of 31.4% vs. 48.8% (p-value = 0.035). The most significant differences were observed in breast and thoracopulmonary surgeries. In a multiple regression analysis, two predictors of PSCNP were identified: the presence of preoperative pain at the surgical site as a risk factor (OR: 3.198; 95% CI [1.326; 7.714]) and RAAT as a protective factor (OR: 0.408; 95% CI [0.173; 0.961]). Conclusion: The neuropathic component of PSCNP can be observed in different types of surgeries. Regional analgesia included in a multimodal approach to postoperative pain management has proven to be effective for acute pain and seems to have a preventive impact on the development of PSCNP and its neuropathic nature, particularly in surgeries that are more prone to chronicization.

Keywords: post-surgical chronic pain, post-surgical chronic neuropathic pain, regional anesthesia-analgesia techniques, neuropathic pain score DN2, preventive impact

Procedia PDF Downloads 61
5980 Numerical Simulation of Rayleigh Benard Convection and Radiation Heat Transfer in Two-Dimensional Enclosure

Authors: Raoudha Chaabane, Faouzi Askri, Sassi Ben Nasrallah

Abstract:

A new numerical algorithm is developed to solve coupled convection-radiation heat transfer in a two-dimensional enclosure. Radiative heat transfer in a participating medium is handled using the control volume finite element method (CVFEM), and the radiative transfer equations (RTE) are formulated for an absorbing, emitting, and scattering medium. The density, velocity, and temperature fields are calculated using the double-population lattice Boltzmann equation (LBE). In order to test the efficiency of the developed method, Rayleigh-Benard convection with and without radiative heat transfer is analyzed. The obtained results are validated against available works in the literature, and the proposed method is found to be efficient, accurate, and numerically stable.
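
As background, a minimal single-population, isothermal D2Q9 BGK lattice Boltzmann step is sketched below on a periodic decaying shear wave; the paper's scheme additionally couples a second (thermal) population and a CVFEM radiation solver, which this sketch does not include.

```python
# Minimal isothermal D2Q9 BGK lattice Boltzmann sketch (periodic decaying
# shear wave), illustrating only the basic LBM building block.
import numpy as np

nx = ny = 64
tau = 0.8                                    # relaxation time
c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])
w = np.array([4/9] + [1/9]*4 + [1/36]*4)

def feq(rho, u):
    cu = np.einsum("qd,xyd->xyq", c, u)
    usq = np.einsum("xyd,xyd->xy", u, u)
    return w * rho[..., None] * (1 + 3*cu + 4.5*cu**2 - 1.5*usq[..., None])

# Initial state: unit density, sinusoidal shear velocity u_x(y)
rho = np.ones((nx, ny))
u = np.zeros((nx, ny, 2))
y = np.arange(ny)
u[:, :, 0] = 0.05 * np.sin(2 * np.pi * y / ny)
f = feq(rho, u)

for step in range(200):
    rho = f.sum(axis=-1)
    u = np.einsum("xyq,qd->xyd", f, c) / rho[..., None]
    f += -(f - feq(rho, u)) / tau                       # BGK collision
    for q in range(9):                                  # streaming (periodic)
        f[:, :, q] = np.roll(f[:, :, q], shift=c[q], axis=(0, 1))

print("max |u_x| after 200 steps:", np.abs(u[:, :, 0]).max())  # viscous decay
```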

Keywords: participating media, LBM, CVFEM, radiation coupled with convection

Procedia PDF Downloads 390
5979 The Green Synthesis of AgNPs from Basil Leaf Extract

Authors: Wanida Wonsawat

Abstract:

Bioreduction of silver nanoparticles (AgNPs) from silver ions (Ag+) using a water extract of Thai basil leaf was successfully carried out. The basil leaf extract provided both the reducing agent and the stabilizing agent for the synthesis of the metal nanoparticles. Silver nanoparticles obtained from cut and uncut basil leaves were compared. The resulting silver nanoparticles were characterized by UV-Vis spectroscopy; the absorption maxima of the nanoparticles from cut and uncut leaves were at 410 and 420 nm, respectively. The techniques involved are simple, eco-friendly, and rapid.

Keywords: basil leaves, silver nanoparticles, green synthesis, plant extract

Procedia PDF Downloads 566
5978 Understanding Complexity at Pre-Construction Stage in Project Planning of Construction Projects

Authors: Mehran Barani Shikhrobat, Roger Flanagan

Abstract:

Construction planning and scheduling based on current tools and techniques is either deterministic in nature (Gantt chart, CPM) or applies only a very small probability of completion (PERT) to each task. However, every project embodies assumptions and influences and should start with a complete set of clearly defined goals and constraints that remain constant throughout the duration of the project. Construction planners continue to apply the traditional methods and tools of “hard” project management that were developed for “ideal projects,” neglecting the potential influence of complexity on the design and construction process. The aim of this research is to investigate the emergence and growth of complexity in project planning and to provide a model that considers the influence of complexity on the total project duration at the post-contract award pre-construction stage of a project. The literature review showed that complexity originates from different sources: the environment, technical issues, and workflow interactions. These can be divided into two categories of complexity factors: first, project tasks, and second, project organisation and management. Complexity in project tasks may originate from performance, lack of resources, or environmental changes for a specific task. Complexity factors that relate to organisation and management refer to workflow and the interdependence of different parts. The literature review also highlighted the ineffectiveness of traditional tools and techniques in planning for complexity. In this research, the fundamental causes of complexity in construction projects were therefore investigated through a questionnaire with industry experts, and the results were used to develop a model that considers the core complexity factors and their interactions. System dynamics was used to investigate the model and to consider the influence of complexity on project planning. Feedback from experts revealed 20 major complexity factors that impact project planning; the factors are divided into five categories known as core complexity factors. To understand the relative weight of each factor, the Analytical Hierarchy Process (AHP) method was used. The comparison showed that externalities are ranked as the biggest influence across the complexity factors. The research underlines that there are many internal and external factors that impact project activities and the project overall, and it shows the importance of considering the influence of complexity on the project master plan undertaken at the post-contract award pre-construction phase of a project.
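
The AHP weighting step can be illustrated with the standard principal-eigenvector computation and consistency check; the 5x5 pairwise comparison matrix below is hypothetical and does not reproduce the survey data.

```python
# Standard AHP priority vector (principal eigenvector) and consistency ratio
# for a hypothetical 5x5 pairwise comparison of the five core categories.
import numpy as np

A = np.array([
    [1,   3,   5,   4,   7],
    [1/3, 1,   3,   2,   5],
    [1/5, 1/3, 1,   1/2, 2],
    [1/4, 1/2, 2,   1,   3],
    [1/7, 1/5, 1/2, 1/3, 1],
], dtype=float)

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                       # priority vector

n = A.shape[0]
lam_max = eigvals.real[k]
ci = (lam_max - n) / (n - 1)                   # consistency index
ri = 1.12                                      # Saaty's random index for n = 5
print("weights:", np.round(weights, 3), "CR:", round(ci / ri, 3))
```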

Keywords: project planning, project complexity measurement, planning uncertainty management, project risk management, strategic project scheduling

Procedia PDF Downloads 95
5977 Efficient Neural and Fuzzy Models for the Identification of Dynamical Systems

Authors: Aouiche Abdelaziz, Soudani Mouhamed Salah, Aouiche El Moundhe

Abstract:

The present paper addresses the utilization of Artificial Neural Networks (ANNs) and Fuzzy Inference Systems (FISs) for the identification and control of dynamical systems with some degree of uncertainty. Because ANNs and FISs have an inherent ability to approximate functions and to adapt to changes in inputs and parameters, they can be used to control systems too complex for linear controllers. In this work, we show how ANNs and FISs can be arranged to form networks that can learn from external data. We then present input structures that can be used with ANNs and FISs to model non-linear systems. Four systems were used to test the identification and control of the proposed structures. The results show that the ANNs and FISs used (trained with the back-propagation algorithm) were efficient in modeling and controlling the non-linear plants.
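
A minimal example of the kind of input structure described (lagged inputs and outputs feeding a network trained by back-propagation) is sketched below for a toy nonlinear plant using scikit-learn; it is not the authors' networks or fuzzy systems.

```python
# NARX-style identification of a toy nonlinear plant with an MLP.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
N = 2000
u = rng.uniform(-1, 1, N)
y = np.zeros(N)
for t in range(2, N):                         # toy nonlinear plant
    y[t] = 0.5 * y[t-1] - 0.3 * y[t-2] + np.tanh(u[t-1]) + 0.1 * u[t-2]

# Regressors: [y(t-1), y(t-2), u(t-1), u(t-2)] -> y(t)
X = np.column_stack([y[1:-1], y[:-2], u[1:-1], u[:-2]])
target = y[2:]

model = MLPRegressor(hidden_layer_sizes=(20, 20), max_iter=2000, random_state=0)
model.fit(X[:1500], target[:1500])
pred = model.predict(X[1500:])
print("validation RMSE:", np.sqrt(np.mean((pred - target[1500:]) ** 2)))
```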

Keywords: non-linear systems, fuzzy set models, neural network, control law

Procedia PDF Downloads 194
5976 Traffic Signal Control Using Citizens’ Knowledge through the Wisdom of the Crowd

Authors: Aleksandar Jovanovic, Katarina Kukic, Ana Uzelac, Dusan Teodorovic

Abstract:

Wisdom of the Crowd (WoC) is a decentralized method that uses the collective intelligence of humans. Individual guesses may be far from the target, but when considered as a group, they converge on optimal solutions for a given problem. We will utilize WoC to address the challenge of controlling traffic lights within intersections from the streets of Kragujevac, Serbia. The problem at hand falls within the category of NP-hard problems. We will employ an algorithm that leverages the swarm intelligence of bees: Bee Colony Optimization (BCO). Data regarding traffic signal timing at a single intersection will be gathered from citizens through a survey. Results obtained in that manner will be compared to the BCO results for different traffic scenarios. We will use Vissim traffic simulation software as a tool to compare the performance of bees’ and humans’ collective intelligence.
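
A toy illustration of the aggregation idea is sketched below: individual guesses of the green-time split are combined by the median and scored with a crude delay proxy, with an exhaustive search standing in for the BCO optimum. The numbers and the delay proxy are assumptions, not the study's survey data or Vissim simulation.

```python
# Toy Wisdom-of-the-Crowd aggregation for a two-phase signal.
import numpy as np

demand = np.array([0.4, 0.2])                 # assumed flow ratios of the two phases
cycle, lost = 90.0, 8.0                       # cycle length and lost time (s)

def delay_proxy(green_share):
    """Crude total-delay proxy: grows as a phase's green falls toward demand."""
    green = np.array([green_share, 1 - green_share]) * (cycle - lost)
    capacity_share = green / cycle
    return np.sum(demand / np.maximum(capacity_share - demand, 1e-3))

rng = np.random.default_rng(42)
guesses = np.clip(rng.normal(0.65, 0.15, size=200), 0.05, 0.95)   # 200 citizens
crowd_share = np.median(guesses)              # the WoC estimate

grid = np.linspace(0.05, 0.95, 1000)          # brute force, standing in for BCO
best_share = grid[np.argmin([delay_proxy(g) for g in grid])]

print(f"crowd split {crowd_share:.2f} -> delay {delay_proxy(crowd_share):.1f}")
print(f"best split  {best_share:.2f} -> delay {delay_proxy(best_share):.1f}")
```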

Keywords: wisdom of the crowd, traffic signal control, combinatorial optimization, bee colony optimization

Procedia PDF Downloads 96
5975 Phenomena-Based Approach for Automated Generation of Process Options and Process Models

Authors: Parminder Kaur Heer, Alexei Lapkin

Abstract:

Due to the global challenges of increased competition and demand for more sustainable products and processes, there is rising pressure on industry to develop innovative processes. Through Process Intensification (PI), existing and new processes may be able to attain higher efficiency. However, very few PI options are generally considered, because processes are typically analysed at the unit operation level, which limits the search space for potential process options. PI performed at more detailed levels of a process can increase the size of the search space. The levels at which PI can be achieved are the unit operation, functional, and phenomena levels. Physical/chemical phenomena form the lowest level of aggregation and are thus expected to give the highest impact, because all intensification options can be described by their enhancement. The objective of the current work is therefore the generation of numerous process alternatives based on phenomena, and the development of their corresponding computer-aided models. The methodology comprises: a) automated generation of process options, and b) automated generation of process models. The process under investigation is disintegrated into functions, viz. reaction, separation, etc., and these functions are further broken down into the phenomena required to perform them. For example, separation may be performed via vapour-liquid or liquid-liquid equilibrium. A list of phenomena for the process is formed, and new phenomena, which can overcome the difficulties or drawbacks of the current process or can enhance its effectiveness, are added to the list. For instance, a catalyst separation issue can be handled by using solid catalysts; the corresponding phenomena are identified and added. The phenomena are then combined to generate all possible combinations. However, not all combinations make sense, and hence screening is carried out to discard the combinations that are meaningless; for example, phase change phenomena need the co-presence of energy transfer phenomena. Feasible combinations of phenomena are then assigned to the functions they execute. A combination may accomplish a single function or multiple functions, i.e., it might perform reaction or reaction with separation. The combinations are then allotted to the functions needed for the process, which creates a series of options for carrying out each function. Combining these options for the different functions in the process leads to the generation of a superstructure of process options. These process options, which are formed by a list of phenomena for each function, are passed to the model generation algorithm in the form of binaries (1, 0). The algorithm gathers the active phenomena and couples them to generate the model. A series of models is generated for the functions, which are combined to get the process model. The most promising process options are then chosen subject to a performance criterion, for example the purity of the product, or via a multi-objective Pareto optimisation. The methodology was applied to a two-step process, and the best route was determined based on the higher product yield. The current methodology can identify, produce, and evaluate process intensification options from which the optimal process can be determined, and it can be applied to any chemical or biochemical process because of its generic nature.
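
The combination-and-screening step can be illustrated with a simplified sketch: enumerate subsets of candidate phenomena, discard those violating simple feasibility rules (such as the phase-change/energy-transfer rule mentioned above), and encode the survivors as binary vectors for model generation. The phenomena names and rules below are illustrative.

```python
# Simplified sketch of phenomena combination generation and screening.
from itertools import combinations

phenomena = ["mixing", "reaction", "heat_transfer",
             "vapour_liquid_equilibrium", "liquid_liquid_equilibrium",
             "phase_change"]

def feasible(combo):
    s = set(combo)
    if "phase_change" in s and "heat_transfer" not in s:
        return False                       # phase change needs energy transfer
    if {"vapour_liquid_equilibrium", "liquid_liquid_equilibrium"} <= s:
        return False                       # assume one separation driving force
    return True

options = [c for r in range(2, len(phenomena) + 1)
           for c in combinations(phenomena, r) if feasible(c)]

# Encode each option as a binary vector, the form passed to model generation
encoded = [[int(p in opt) for p in phenomena] for opt in options]
print(len(options), "feasible combinations; first encoding:", encoded[0])
```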

Keywords: phenomena, process intensification, process models, process options

Procedia PDF Downloads 220
5974 Development of a Data-Driven Method for Diagnosing the State of Health of Battery Cells, Based on the Use of an Electrochemical Aging Model, with a View to Their Use in Second Life

Authors: Desplanches Maxime

Abstract:

Accurate estimation of the remaining useful life of lithium-ion batteries for electronic devices is crucial. Data-driven methodologies encounter challenges related to data volume and acquisition protocols, particularly in capturing a comprehensive range of aging indicators. To address these limitations, we propose a hybrid approach that integrates an electrochemical model with state-of-the-art data analysis techniques, yielding a comprehensive database. Our methodology involves infusing an aging phenomenon into a Newman model, leading to the creation of an extensive database capturing various aging states based on non-destructive parameters. This database serves as a robust foundation for subsequent analysis. Leveraging advanced data analysis techniques, notably principal component analysis and t-Distributed Stochastic Neighbor Embedding, we extract pivotal information from the data. This information is harnessed to construct a regression function using either random forest or support vector machine algorithms. The resulting predictor demonstrates a 5% error margin in estimating remaining battery life, providing actionable insights for optimizing usage. Furthermore, the database was built from the Newman model calibrated for aging and performance using data from a European project called Teesmat. The model was then initialized numerous times with different aging values, for instance, with varying thicknesses of SEI (Solid Electrolyte Interphase). This comprehensive approach ensures a thorough exploration of battery aging dynamics, enhancing the accuracy and reliability of our predictive model. Of particular importance is our reliance on the database generated through the integration of the electrochemical model. This database serves as a crucial asset in advancing our understanding of aging states. Beyond its capability for precise remaining life predictions, this database-driven approach offers valuable insights for optimizing battery usage and adapting the predictor to various scenarios. This underscores the practical significance of our method in facilitating better decision-making regarding lithium-ion battery management.
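
A hedged sketch of the data-analysis chain (dimensionality reduction followed by a regression model predicting remaining life) is shown below with scikit-learn on synthetic stand-in data; it does not use the Newman-model database, and t-SNE, which is mainly used for visualisation, is omitted.

```python
# PCA + random forest regression on synthetic stand-in aging indicators.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 20))               # non-destructive aging indicators
remaining_life = 2000 - 50 * X[:, 0] + 30 * X[:, 1] + rng.normal(0, 20, 1000)

X_tr, X_te, y_tr, y_te = train_test_split(X, remaining_life, random_state=0)
model = make_pipeline(StandardScaler(), PCA(n_components=5),
                      RandomForestRegressor(n_estimators=200, random_state=0))
model.fit(X_tr, y_tr)
pred = model.predict(X_te)
print("mean relative error:", np.mean(np.abs(pred - y_te) / y_te))
```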

Keywords: Li-ion battery, aging, diagnostics, data analysis, prediction, machine learning, electrochemical model, regression

Procedia PDF Downloads 54
5973 Reinforcement Learning the Born Rule from Photon Detection

Authors: Rodrigo S. Piera, Jailson Sales Araújo, Gabriela B. Lemos, Matthew B. Weiss, John B. DeBrota, Gabriel H. Aguilar, Jacques L. Pienaar

Abstract:

The Born rule was historically viewed as an independent axiom of quantum mechanics until Gleason derived it in 1957 by assuming the Hilbert space structure of quantum measurements [1]. In subsequent decades there have been diverse proposals to derive the Born rule starting from even more basic assumptions [2]. In this work, we demonstrate that a simple reinforcement-learning algorithm, having no pre-programmed assumptions about quantum theory, will nevertheless converge to a behaviour pattern that accords with the Born rule, when tasked with predicting the output of a quantum optical implementation of a symmetric informationally-complete measurement (SIC). Our findings support a hypothesis due to QBism (the subjective Bayesian approach to quantum theory), which states that the Born rule can be thought of as a normative rule for making decisions in a quantum world [3].
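
A toy version of the idea can be sketched as a bandit-style learner with no built-in quantum rules: it repeatedly guesses the outcome of a simulated qubit SIC measurement, and its learned action values converge to the outcome frequencies, i.e. to the Born-rule probabilities p(i) = (1 + b·n_i)/4. This is an illustration only, not the authors' agent or the optical experiment.

```python
# Bandit-style learner whose action values converge to the Born probabilities
# of a simulated qubit SIC measurement (illustration, not the paper's setup).
import numpy as np

rng = np.random.default_rng(0)

# Qubit SIC: four Bloch vectors forming a regular tetrahedron
n = np.array([[1, 1, 1], [1, -1, -1], [-1, 1, -1], [-1, -1, 1]]) / np.sqrt(3)
b = np.array([0.3, -0.5, 0.7])                    # Bloch vector of the state
p = (1 + n @ b) / 4                               # Born-rule probabilities

Q = np.zeros(4)                                   # learned value per outcome
counts = np.zeros(4)
for t in range(20000):
    guess = rng.integers(4)                       # explore uniformly
    outcome = rng.choice(4, p=p)                  # simulated photon detection
    reward = float(guess == outcome)
    counts[guess] += 1
    Q[guess] += (reward - Q[guess]) / counts[guess]   # incremental average

print("Born probabilities:", np.round(p, 3))
print("learned values:    ", np.round(Q, 3))
```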

Keywords: quantum Bayesianism, quantum theory, quantum information, quantum measurement

Procedia PDF Downloads 85
5972 Voltage Stability Assessment and Enhancement Using STATCOM - A Case Study

Authors: Puneet Chawla, Balwinder Singh

Abstract:

Recently, increased attention has been devoted to the voltage instability phenomenon in power systems. Many techniques have been proposed in the literature for evaluating and predicting voltage stability using steady-state analysis methods. In this paper, P-V and Q-V curves have been generated for the 57-bus Patiala-Rajpura circle of India. The power-flow program is developed in MATLAB using the Newton-Raphson method. Using the Q-V curves, the weakest bus of the power system and the maximum permissible reactive power change on that bus are calculated. STATCOMs are placed on the weakest bus to improve the voltage, and hence the voltage stability, as well as the power transmission capability of the line.
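
The computation behind P-V and Q-V curves rests on repeated power-flow solutions. A minimal Newton-Raphson power flow for a 2-bus system (slack plus load bus), with a finite-difference Jacobian, is sketched below; the line and load data are illustrative, and the study's 57-bus model and STATCOM placement are not reproduced.

```python
# Minimal Newton-Raphson power flow for a 2-bus system (slack + load bus).
import numpy as np

z_line = 0.02 + 0.08j                        # per-unit line impedance
Y = np.array([[1/z_line, -1/z_line],
              [-1/z_line, 1/z_line]])
P_load, Q_load = 0.8, 0.4                    # per-unit load at bus 2
V1 = 1.0 + 0.0j                              # slack bus voltage

def mismatch(x):
    theta2, v2 = x
    V = np.array([V1, v2 * np.exp(1j * theta2)])
    S2 = V[1] * np.conj(Y[1] @ V)            # injected complex power at bus 2
    return np.array([S2.real + P_load, S2.imag + Q_load])

x = np.array([0.0, 1.0])                     # flat start: angle 0, |V| = 1
for _ in range(20):
    f = mismatch(x)
    if np.max(np.abs(f)) < 1e-8:
        break
    J = np.zeros((2, 2))                     # numerical Jacobian
    for j in range(2):
        dx = np.zeros(2); dx[j] = 1e-6
        J[:, j] = (mismatch(x + dx) - f) / 1e-6
    x = x - np.linalg.solve(J, f)

print(f"bus-2 voltage: {x[1]:.4f} pu at {np.degrees(x[0]):.2f} degrees")
```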

Keywords: voltage stability, reactive power, power flow, weakest bus, STATCOM

Procedia PDF Downloads 500
5971 Model Predictive Controller for Pasteurization Process

Authors: Tesfaye Alamirew Dessie

Abstract:

Our study focuses on developing a Model Predictive Controller (MPC) and evaluating it against a traditional PID controller for a pasteurization process. The dynamics of the pasteurization process were identified from experimental data using system identification. The quality of several model structures was evaluated using best fit with data validation, residual analysis, and stability analysis; the auto-regressive with exogenous input (ARX322) model of the pasteurization process fit the validation data by roughly 80.37 percent. The ARX322 model structure was then used to design the MPC and PID control techniques. After comparing controller performance based on settling time, overshoot percentage, and stability analysis, it was found that the MPC controller outperforms the PID for those parameters.
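
An ARX(3,2,2)-type model of the kind named above can be fitted by ordinary least squares; the sketch below does this on a toy data set, since the real identification used the experimental pasteurizer data.

```python
# Least-squares fit of an ARX model with three output lags, two input lags,
# and a two-sample delay, on a toy data set.
import numpy as np

rng = np.random.default_rng(0)
N = 1000
u = rng.uniform(0, 1, N)                          # e.g. heater input
a_true, b_true = [1.2, -0.5, 0.1], [0.4, 0.2]     # toy "true" plant
y = np.zeros(N)
for t in range(3, N):
    y[t] = (a_true[0]*y[t-1] + a_true[1]*y[t-2] + a_true[2]*y[t-3]
            + b_true[0]*u[t-2] + b_true[1]*u[t-3] + 0.01*rng.normal())

# Regression: y(t) = a1*y(t-1) + a2*y(t-2) + a3*y(t-3) + b1*u(t-2) + b2*u(t-3)
Phi = np.array([[y[t-1], y[t-2], y[t-3], u[t-2], u[t-3]] for t in range(3, N)])
theta, *_ = np.linalg.lstsq(Phi, y[3:], rcond=None)
print("estimated [a1 a2 a3 b1 b2]:", np.round(theta, 3))
```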

Keywords: MPC, PID, ARX, pasteurization

Procedia PDF Downloads 145
5970 Estimation of Relative Subsidence of Collapsible Soils Using Electromagnetic Measurements

Authors: Henok Hailemariam, Frank Wuttke

Abstract:

Collapsible soils are weak soils that appear to be stable in their natural state, normally dry condition, but rapidly deform under saturation (wetting), thus generating large and unexpected settlements which often yield disastrous consequences for structures unwittingly built on such deposits. In this study, a prediction model for the relative subsidence of stressed collapsible soils based on dielectric permittivity measurement is presented. Unlike most existing methods for soil subsidence prediction, this model does not require moisture content as an input parameter, thus providing the opportunity to obtain accurate estimation of the relative subsidence of collapsible soils using dielectric measurement only. The prediction model is developed based on an existing relative subsidence prediction model (which is dependent on soil moisture condition) and an advanced theoretical frequency and temperature-dependent electromagnetic mixing equation (which effectively removes the moisture content dependence of the original relative subsidence prediction model). For large scale sub-surface soil exploration purposes, the spatial sub-surface soil dielectric data over wide areas and high depths of weak (collapsible) soil deposits can be obtained using non-destructive high frequency electromagnetic (HF-EM) measurement techniques such as ground penetrating radar (GPR). For laboratory or small scale in-situ measurements, techniques such as an open-ended coaxial line with widely applicable time domain reflectometry (TDR) or vector network analysers (VNAs) are usually employed to obtain the soil dielectric data. By using soil dielectric data obtained from small or large scale non-destructive HF-EM investigations, the new model can effectively predict the relative subsidence of weak soils without the need to extract samples for moisture content measurement. Some of the resulting benefits are the preservation of the undisturbed nature of the soil as well as a reduction in the investigation costs and analysis time in the identification of weak (problematic) soils. The accuracy of prediction of the presented model is assessed by conducting relative subsidence tests on a collapsible soil at various initial soil conditions and a good match between the model prediction and experimental results is obtained.

Keywords: collapsible soil, dielectric permittivity, moisture content, relative subsidence

Procedia PDF Downloads 340
5969 Nonlinear Free Surface Flow Simulations Using Smoothed Particle Hydrodynamics

Authors: Abdelraheem M. Aly, Minh Tuan Nguyen, Sang-Wook Lee

Abstract:

The incompressible smoothed particle hydrodynamics (ISPH) method is used to simulate impact free surface flows. In ISPH, the pressure is evaluated by solving a pressure Poisson equation using a semi-implicit algorithm based on the projection method. The current ISPH method is applied to simulate dam break flow over an inclined plane with different inclination angles, and the effect of the inclination angle on the wave-front velocity and the pressure distribution is discussed. The impact of a circular cylinder on water in a tank has also been simulated using the ISPH method. The computed pressures on the solid boundaries are studied and compared with the experimental results.

Keywords: incompressible smoothed particle hydrodynamics, free surface flow, inclined plane, water entry impact

Procedia PDF Downloads 387
5968 Analysis of Histogram Asymmetry for Waste Recognition

Authors: Janusz Bobulski, Kamila Pasternak

Abstract:

Despite many years of effort and research, the problem of waste management is still current. So far, no fully effective waste management system has been developed. Many programs and projects improve statistics on the percentage of waste recycled every year. In these efforts, it is worth using modern Computer Vision techniques supported by artificial intelligence. In the article, we present a method of identifying plastic waste based on the asymmetry analysis of the histogram of the image containing the waste. The method is simple but effective (94%), which allows it to be implemented on devices with low computing power, in particular on microcomputers. Such devices will be used both at home and in waste sorting plants.
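
The core of the method, as described, reduces to a histogram-asymmetry (skewness) feature and a decision rule. The sketch below illustrates this; the threshold and the direction of the rule are assumptions, not the paper's calibrated values.

```python
# Histogram-asymmetry (skewness) feature for a grayscale waste image.
import numpy as np

def histogram_skewness(gray_image):
    """Skewness of the gray-level distribution of an 8-bit image."""
    pixels = gray_image.astype(float).ravel()
    mu, sigma = pixels.mean(), pixels.std()
    return np.mean(((pixels - mu) / (sigma + 1e-9)) ** 3)

def is_plastic(gray_image, threshold=0.5):
    """Toy decision rule: strongly asymmetric histogram -> plastic."""
    return abs(histogram_skewness(gray_image)) > threshold

# Example with a synthetic image standing in for a camera frame
frame = np.clip(np.random.gamma(2.0, 40.0, size=(240, 320)), 0, 255).astype(np.uint8)
print("skewness:", round(histogram_skewness(frame), 2), "plastic:", is_plastic(frame))
```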

Keywords: waste management, environmental protection, image processing, computer vision

Procedia PDF Downloads 101
5967 Fabrication of High Aspect Ratio Metallic Nanowires by AC Electrodeposition into Nanoporous Alumina

Authors: M. Beyzaie, S. Mohammadi

Abstract:

High aspect ratio metallic (silver, cobalt) nanowire arrays were fabricated by AC electrodeposition into a nanoporous alumina template. The template, with long pore depth, was fabricated by hard anodization (HA) and then thinned for AC electrodeposition. Template preparation was completed in a short time by using the HA technique and a high-speed thinning process. TEM and XRD investigations confirm the three-dimensional nucleation growth mechanism of the metallic nanowires inside the nanoporous alumina fabricated by the HA process.

Keywords: metallic, nanowire, nanoporous alumina, ac electrodeposition

Procedia PDF Downloads 260
5966 Fuzzy Logic Based Sliding Mode Controller for a New Soft Switching Boost Converter

Authors: Azam Salimi, Majid Delshad

Abstract:

This paper presents a modified design of a sliding mode controller based on fuzzy logic for a new ZVT high step-up DC-DC converter. A proportional-integral (PI)-type current mode control is employed, and a sliding mode controller is designed using a fuzzy algorithm. The sliding mode controller guarantees robustness against all variations, and the fuzzy logic helps to reduce the chattering phenomenon caused by the sliding controller; in this way, efficiency increases while the error and the voltage and current ripples decrease. The proposed system is simulated using MATLAB/SIMULINK. The model is tested under variations of the input and reference voltages, and it was found that the proposed controller performs better than conventional sliding mode controllers.

Keywords: switching mode power supplies, DC-DC converters, sliding mode control, robustness, fuzzy control, current mode control, non-linear behavior

Procedia PDF Downloads 522
5965 Application of Low-order Modeling Techniques and Neural-Network Based Models for System Identification

Authors: Venkatesh Pulletikurthi, Karthik B. Ariyur, Luciano Castillo

Abstract:

System identification from turbulence wakes can provide a tactical advantage in preparing for, and predicting the trajectory of, an opponent's movements. A low-order modeling technique, proper orthogonal decomposition (POD), is used to predict the object from its wake pattern and is compared with a pre-trained image recognition neural network (NN) that classifies the wake patterns into objects. It is demonstrated that the low-order POD model predicts the objects better than the pre-trained NN by about 30%.
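
A minimal POD sketch is shown below: snapshots are stacked as columns, the mean is subtracted, and the SVD provides the modes and temporal coefficients that form the low-order model; the snapshot data are synthetic stand-ins for wake measurements.

```python
# Minimal proper orthogonal decomposition (POD) of a synthetic snapshot matrix.
import numpy as np

rng = np.random.default_rng(0)
n_points, n_snapshots = 500, 80
t = np.linspace(0, 4 * np.pi, n_snapshots)
x = np.linspace(0, 1, n_points)[:, None]
snapshots = (np.sin(2 * np.pi * x) * np.cos(t)                 # coherent structure 1
             + 0.5 * np.sin(4 * np.pi * x) * np.sin(2 * t)     # coherent structure 2
             + 0.05 * rng.normal(size=(n_points, n_snapshots)))

mean_flow = snapshots.mean(axis=1, keepdims=True)
U, s, Vt = np.linalg.svd(snapshots - mean_flow, full_matrices=False)

energy = s**2 / np.sum(s**2)
print("energy captured by first 2 modes:", round(energy[:2].sum(), 3))
coeffs = np.diag(s[:2]) @ Vt[:2]       # temporal coefficients of the 2 leading modes
```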

Keywords: the bluff body wakes, low-order modeling, neural network, system identification

Procedia PDF Downloads 164
5964 Cyclostationary Analysis of Polytime Coded Signals for LPI Radars

Authors: Metuku Shyamsunder, Kakarla Subbarao, P. Prasanna

Abstract:

In radar, an electromagnetic waveform is transmitted, and an echo of the same signal is received by the receiver. From this received signal, by extracting various parameters such as round-trip delay and Doppler frequency, it is possible to find distance, speed, altitude, etc. However, as technology advances, intruders can intercept the transmitted signal as it reaches them, extract its characteristics, and try to modify it. So there is a need to develop a system whose signal cannot be identified by non-cooperative intercept receivers; this is why low probability of intercept (LPI) radars came into existence. In this paper, a brief discussion of LPI radar, its modulation with a polytime code (PT1), and its detection with cyclostationary techniques (DFSM and FAM) is presented, and the detection techniques are compared with respect to computational complexity.

Keywords: LPI radar, polytime codes, cyclostationary DFSM, FAM

Procedia PDF Downloads 459
5963 Reliable Method for Estimating Rating Curves in the Natural Rivers

Authors: Arash Ahmadi, Amirreza Kavousizadeh, Sanaz Heidarzadeh

Abstract:

The stage-discharge curve is one of the conventional methods for continuous river flow measurement. In this paper, an innovative approach is proposed for predicting the stage-discharge relationship using isovel contours. Using the proposed method, it is possible to estimate the stage-discharge curve for the whole cross-section using discharge information from just one arbitrary water level. For this purpose, multivariate relationships are used to determine the mean velocity in a cross-section. The unknown exponents of the proposed relationship were obtained using the second version of the Strength Pareto Evolutionary Algorithm (SPEA2), and the appropriate equation was selected by applying the TOPSIS (Technique for Order Preference by Similarity to Ideal Solution) approach. Results showed a close agreement between the estimated and observed data in the different cross-sections.
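
The TOPSIS selection step is a standard ranking procedure; a generic sketch is given below with an illustrative decision matrix and weights, not the paper's candidate equations.

```python
# Generic TOPSIS ranking: normalise, weight, measure distances to the ideal
# and anti-ideal solutions, rank by relative closeness.
import numpy as np

# rows = candidate equations, columns = criteria (assumed benefit criteria)
X = np.array([[0.92, 0.85, 0.70],
              [0.88, 0.90, 0.80],
              [0.95, 0.78, 0.75]])
w = np.array([0.5, 0.3, 0.2])                # illustrative criterion weights

R = X / np.linalg.norm(X, axis=0)            # vector normalisation
V = R * w                                    # weighted normalised matrix
ideal, anti = V.max(axis=0), V.min(axis=0)   # ideal / anti-ideal solutions
d_plus = np.linalg.norm(V - ideal, axis=1)
d_minus = np.linalg.norm(V - anti, axis=1)
closeness = d_minus / (d_plus + d_minus)
print("closeness scores:", np.round(closeness, 3), "best:", int(np.argmax(closeness)))
```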

Keywords: rating curves, SPEA2, natural rivers, bed roughness distribution

Procedia PDF Downloads 143
5962 Constructing Orthogonal De Bruijn and Kautz Sequences and Applications

Authors: Yaw-Ling Lin

Abstract:

A de Bruijn graph of order k is a graph whose vertices represent all length-k sequences, with edges joining pairs of vertices whose sequences have the maximum possible overlap (length k−1). Every Hamiltonian cycle of this graph defines a distinct, minimum-length de Bruijn sequence containing all k-mers exactly once. A Kautz sequence is the minimal generating sequence, that is, the sequence of minimal length that produces all possible length-k sequences under the restriction that every two consecutive symbols in the sequence must be different. A collection of de Bruijn/Kautz sequences is orthogonal if any two sequences differ maximally in composition; that is, the maximum length of their common substring is k. In this paper, we discuss how such a collection of (maximal) orthogonal de Bruijn/Kautz sequences can be constructed, and we use the algorithm to build a web application service for synthesized DNA and other related biomolecular sequences.
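
For reference, the classic single-sequence construction of a de Bruijn sequence (the recursive Lyndon-word/FKM algorithm) is sketched below; generating an orthogonal collection, and the Kautz-sequence variant, is the subject of the paper and is not attempted here.

```python
def de_bruijn(k, n):
    """Standard de Bruijn sequence B(k, n) over alphabet {0,...,k-1} via the
    recursive Lyndon-word (FKM) construction; each n-mer appears exactly once
    when the sequence is read cyclically."""
    a = [0] * k * n
    sequence = []

    def db(t, p):
        if t > n:
            if n % p == 0:
                sequence.extend(a[1:p + 1])
        else:
            a[t] = a[t - p]
            db(t + 1, p)
            for j in range(a[t - p] + 1, k):
                a[t] = j
                db(t + 1, t)

    db(1, 1)
    return sequence

# Example: binary de Bruijn sequence of order 3 (length 8, contains all 3-mers)
print(de_bruijn(2, 3))  # [0, 0, 0, 1, 0, 1, 1, 1]
```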

Keywords: biomolecular sequence synthesis, de Bruijn sequences, Eulerian cycle, Hamiltonian cycle, Kautz sequences, orthogonal sequences

Procedia PDF Downloads 149
5961 The ‘Quartered Head Technique’: A Simple, Reliable Way of Maintaining Leg Length and Offset during Total Hip Arthroplasty

Authors: M. Haruna, O. O. Onafowokan, G. Holt, K. Anderson, R. G. Middleton

Abstract:

Background: Requirements for satisfactory outcomes following total hip arthroplasty (THA) include restoration of femoral offset, version, and leg length. Various techniques have been described for restoring these biomechanical parameters, with leg length restoration being the most predominantly described. We describe a “quartered head technique” (QHT) which uses a stepwise series of femoral head osteotomies to identify and preserve the centre of rotation of the femoral head during THA in order to ensure reconstruction of leg length, offset and stem version, such that hip biomechanics are restored as near to normal as possible. This study aims to identify whether using the QHT during hip arthroplasty effectively restores leg length and femoral offset to within acceptable parameters. Methods: A retrospective review of 206 hips was carried out, leaving 124 hips in the final analysis. Power analysis indicated a minimum of 37 patients required. All operations were performed using an anterolateral approach by a single surgeon. All femoral implants were cemented, collarless, polished double taper CPT® stems (Zimmer, Swindon, UK). Both cemented, and uncemented acetabular components were used (Zimmer, Swindon, UK). Leg length, version, and offset were assessed intra-operatively and reproduced using the QHT. Post-operative leg length and femoral offset were determined and compared with the contralateral native hip, and the difference was then calculated. For the determination of leg length discrepancy (LLD), we used the method described by Williamson & Reckling, which has been shown to be reproducible with a measurement error of ±1mm. As a reference, the inferior margin of the acetabular teardrop and the most prominent point of the lesser trochanter were used. A discrepancy of less than 6mm LLD was chosen as acceptable. All peri-operative radiographs were assessed by two independent observers. Results: The mean absolute post-operative difference in leg length from the contralateral leg was +3.58mm. 84% of patients (104/124) had LLD within ±6mm of the contralateral limb. The mean absolute post-operative difference in offset from contralateral leg was +3.88mm (range -15 to +9mm, median 3mm). 90% of patients (112/124) were within ±6mm offset of the contralateral limb. There was no statistical difference noted between observer measurements. Conclusion: The QHT provides a simple, inexpensive yet effective method of maintaining femoral leg length and offset during total hip arthroplasty. Combining this technique with pre-operative templating or other techniques described may enable surgeons to reduce even further the discrepancies between pre-operative state and post-operative outcome.

Keywords: leg length discrepancy, technical tip, total hip arthroplasty, operative technique

Procedia PDF Downloads 69
5960 Evaluation of the Impact of Functional Communication Training on Behaviors of Concern for Students at a Non-Maintained Special School

Authors: Kate Duggan

Abstract:

Introduction: Functional Communication Training (FCT) is an approach which aims to reduce behaviours of concern by teaching more effective ways to communicate. It requires identification of the function of the behaviour of concern, through gathering information from key stakeholders and completing observations of the individual’s behaviour including antecedents to, and consequences of the behaviour. Appropriate communicative alternatives are then identified and taught to the individual using systematic instruction techniques. Behaviours of concern demonstrated by individuals with autism spectrum conditions (ASC) frequently have a communication function. When contributing to positive behavior support plans, speech and language therapists and other professionals working with individuals with ASC need to identify alternative communicative behaviours which are equally reinforcing as the existing behaviours of concern. Successful implementation of FCT is dependent on an effective ‘response match’. The new way of communicating must be equally as effective as the behaviour previously used and require the same amount or less effort from the individual. It must also be understood by the communication partners the individual encounters and be appropriate to their communicative contexts. Method: Four case studies within a non-maintained special school environment were described and analysed. A response match framework was used to identify the effectiveness of functional communication training delivered by the student’s speech and language therapist, teacher and learning support assistants. The success of systematic instruction techniques used to develop new communicative behaviours was evaluated using the CODES framework. Findings: Functional communication training can be used as part of a positive behaviour support approach for students within this setting. All case studies reviewed demonstrated ‘response success’, in that the desired response was gained from the new communicative behaviour. Barriers to the successful embedding of new communicative behaviours were encountered. In some instances, the new communicative behaviour could not be consistently understood across all communication partners which reduced ‘response recognisability’. There was also evidence of increased physical or cognitive difficulty in employing the new communicative behaviour which reduced the ‘response effectivity’. Successful use of ‘thinning schedules of reinforcement’, taught students to tolerate a delay to reinforcement once the new communication behaviour was learned.

Keywords: augmentative and alternative communication, autism spectrum conditions, behaviours of concern, functional communication training

Procedia PDF Downloads 106
5959 Increasing the Competitiveness of Batik Products as a Ready-To-Wear Cash Material Through Patterned Batik Innovation with Quilting Technique, at Klampar Batik Tourism Village

Authors: Urip Wahyuningsih, Indarti, Yuhri Inang Prihatina

Abstract:

The ongoing development of batik art has given rise to various batik industries, which have emerged to meet the needs of the growing batik fashion market. This creates competition among batik producers for a share of the existing batik clothing market. Conditions like this also occur in Klampar Village, Pamekasan, Madura; as one of the Batik Tourism Villages in Indonesia, it must continue to improve by maintaining the characteristics of Klampar Pamekasan Madura batik while constantly innovating, so that it remains highly competitive and remains a popular batik tourist destination. Ready-to-wear clothing is clothing that is mass produced in various sizes and colors, which can be purchased directly and worn easily. Patterned batik cloth is basically batik cloth on which the pattern lines of the clothing parts are arranged efficiently, so there is no need to design the layout of the clothing-part patterns on the batik cloth to be cut. Quilting can be defined as the art of combining fabric materials of certain sizes and cuts to form unique motifs. Based on the points above, a breakthrough production innovation is needed that does not abandon the characteristics of Klampar Pamekasan Madura batik as a product of one of the Batik Tourism Villages in Indonesia. One such innovation is creating ready-to-wear patterned batik clothing products using a quilting technique. The method used in this research is the Double Diamond Design Process, which is divided into four phases: discover (designing the theme of the ready-to-wear patterned batik innovation concept using quilting techniques in the Batik Village of Klampar, Pamekasan, Madura), define (determining the design brief and presenting challenges to the design), develop (presenting prototypes that are developed, tested, reviewed, and refined), and deliver (producing the selected designs, which pass final tests and are ready to be commercialized). The research produces ready-to-wear patterned batik products made with quilting techniques that have been validated by experts and accepted by the public.

Keywords: competitiveness, ready to wear, innovation, quilting, Klampar batik village

Procedia PDF Downloads 37