Search results for: cooking methods.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4059

2709 Inference of Stress-Strength Model for a Lomax Distribution

Authors: H. Panahi, S. Asadi

Abstract:

In this paper, the estimation of the stress-strength parameter R = P(Y < X) is studied, where X and Y are independent Lomax random variables with a common scale parameter but different shape parameters. The maximum likelihood estimator of R is derived. Assuming that the common scale parameter is known, the Bayes estimator and an exact confidence interval for R are discussed. A simulation study has been carried out to investigate the performance of the different proposed methods.
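
For the standard Lomax parameterization with survival function S(x) = (1 + x/λ)^(-α) and a known common scale λ, the stress-strength probability reduces to R = α_Y / (α_X + α_Y), and the maximum likelihood estimator of each shape is n / Σ log(1 + x_i/λ). A minimal Python sketch of the plug-in estimator follows; the shape values, scale and sample sizes are illustrative choices, not taken from the paper.

```python
# Minimal sketch (not the paper's derivation): Monte Carlo check of the
# closed-form R = P(Y < X) for Lomax variables with a known common scale,
# and the plug-in MLE based on alpha_hat = n / sum(log(1 + x / lam)).
import numpy as np

rng = np.random.default_rng(0)
a_x, a_y, lam, n = 2.0, 3.0, 1.5, 500         # assumed shapes, common scale, sample size

def lomax_sample(alpha, lam, size):
    # Inverse-CDF sampling from S(x) = (1 + x/lam)^(-alpha)
    u = rng.uniform(size=size)
    return lam * (u ** (-1.0 / alpha) - 1.0)

def shape_mle(x, lam):
    # MLE of the shape parameter when the scale lam is known
    return len(x) / np.sum(np.log1p(x / lam))

x = lomax_sample(a_x, lam, n)                  # strength sample
y = lomax_sample(a_y, lam, n)                  # stress sample

r_true = a_y / (a_x + a_y)                     # closed-form R = P(Y < X)
a_x_hat, a_y_hat = shape_mle(x, lam), shape_mle(y, lam)
r_mle = a_y_hat / (a_x_hat + a_y_hat)          # plug-in estimate (invariance of the MLE)
r_mc = np.mean(lomax_sample(a_y, lam, 10**6) < lomax_sample(a_x, lam, 10**6))

print(f"R true {r_true:.3f}  MLE {r_mle:.3f}  Monte Carlo {r_mc:.3f}")
```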

Keywords: Stress-Strength model; maximum likelihood estimator; Bayes estimator; Lomax distribution

2708 Visualization of Searching and Sorting Algorithms

Authors: Bremananth R., Radhika V., Thenmozhi S.

Abstract:

This paper employs multimedia tools to present the execution sequences of algorithms in an interactive manner. It helps learners grasp fundamental algorithms, such as searching and sorting methods, in a simple way. Visualization attracts more attention than theoretical study and eases the learning process. We propose methods for presenting the runtime sequence of each algorithm interactively and aim to overcome the drawbacks of existing character-based systems. The system illustrates each and every step clearly using text and animation. Comparisons of time complexity have been carried out, and the results show that our approach provides a better understanding of the algorithms.
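
As an illustration of recording a runtime sequence for animation (not the authors' multimedia system), a sorting routine can be written as a generator that yields the array state and the compared indices at every step, so that a front end can render each snapshot as one frame.

```python
# Hypothetical sketch: a bubble sort written as a generator so that every
# comparison/swap can be rendered as one animation frame by a visualizer.
from typing import Iterator, List, Tuple

def bubble_sort_steps(values: List[int]) -> Iterator[Tuple[List[int], int, int, bool]]:
    a = list(values)
    n = len(a)
    for i in range(n - 1):
        for j in range(n - 1 - i):
            swapped = a[j] > a[j + 1]
            if swapped:
                a[j], a[j + 1] = a[j + 1], a[j]
            # yield a snapshot: current array, compared indices, whether a swap happened
            yield list(a), j, j + 1, swapped

for frame, (state, i, j, swapped) in enumerate(bubble_sort_steps([5, 1, 4, 2, 8])):
    print(f"frame {frame:02d}: {state}  compared ({i},{j})  swap={swapped}")
```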

Keywords: Algorithms, Searching, Sorting, Visualization.

2707 Kalman's Shrinkage for Wavelet-Based Despeckling of SAR Images

Authors: Mario Mastriani, Alberto E. Giraldez

Abstract:

In this paper, a new probability density function (pdf) is proposed to model the statistics of wavelet coefficients, and a simple Kalman filter is derived from the new pdf using Bayesian estimation theory. Specifically, we decompose the speckled image into wavelet subbands, apply the Kalman filter to the high-frequency subbands, and reconstruct a despeckled image from the modified detail coefficients. Experimental results demonstrate that our method compares favorably to several other despeckling methods on test synthetic aperture radar (SAR) images.
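
The pipeline in the abstract (decompose, shrink the detail subbands, reconstruct) can be sketched generically. The snippet below uses plain soft thresholding with PyWavelets instead of the Kalman-based shrinkage derived in the paper, and the wavelet, decomposition level and threshold are arbitrary choices.

```python
# Generic wavelet-shrinkage sketch (soft thresholding), NOT the paper's
# Kalman-based estimator: decompose -> shrink detail subbands -> reconstruct.
import numpy as np
import pywt

def despeckle_sketch(image: np.ndarray, wavelet: str = "db2",
                     level: int = 2, thr: float = 0.1) -> np.ndarray:
    coeffs = pywt.wavedec2(image, wavelet, level=level)
    shrunk = [coeffs[0]]                        # keep the approximation subband
    for (cH, cV, cD) in coeffs[1:]:             # shrink each detail subband
        shrunk.append(tuple(pywt.threshold(c, thr, mode="soft") for c in (cH, cV, cD)))
    return pywt.waverec2(shrunk, wavelet)

noisy = np.random.default_rng(1).gamma(shape=4.0, scale=0.25, size=(128, 128))  # toy multiplicative-style noise
clean = despeckle_sketch(noisy)
```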

Keywords: Kalman's filter, shrinkage, speckle, wavelets.

2706 Energy Loss Reduction in Oil Refineries through Flare Gas Recovery Approaches

Authors: Majid Amidpour, Parisa Karimi, Marzieh Joda

Abstract:

For the last few years, the release of burned undesirable by-products has become a challenging issue in the oil industry. Flaring, one of the main sources of air contamination, has detrimental and long-lasting effects on human health and is considered a substantial cause of energy losses worldwide. This research studies the implications of two main flare gas recovery methods at three oil refineries in Iran, referred to as case I, case II, and case III in order of increasing production capacity. In the proposed methods, flare gases are converted into more valuable products before combustion in the flare networks. The first approach involves collecting, compressing and converting the flare gas to smokeless fuel that can be used in the fuel gas system of the refineries. The other scenario utilizes the flare gas as a feed to the liquefied petroleum gas (LPG) production unit already established in the refineries. The processes of both scenarios are simulated, and the capital investment is calculated for each procedure. The cumulative profits of the scenarios are evaluated using the Net Present Value method. Furthermore, a sensitivity analysis based on the total propane and butane mole fraction is carried out to allow a rational comparison of the LPG production approach, and the results are illustrated for different mole fractions of propane and butane. As the mole fraction of propane and butane contained in LPG differs between summer and winter, the results for the LPG scenario are presented for each season. The simulation results show that the cumulative profit of the fuel gas production scenario and the LPG production rate increase with the capacity of the refineries. Moreover, the investment return time of the LPG production method first declines and then rises as the C3 and C4 content increases. The minimum return time occurs at a combined propane and butane mole fraction of 0.7, 0.6, and 0.7 in cases I, II, and III, respectively. Based on a comparison of the investment return time and the cumulative profit, fuel gas production is the superior scenario for all three case studies.
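
As a reminder of the evaluation criteria used above, the Net Present Value and a simple payback time can be computed from a yearly cash-flow series; the capital cost, cash flows and discount rate in the sketch below are placeholder values, not figures from the study.

```python
# Illustrative NPV / payback calculation; all numbers are made-up placeholders,
# not the refinery data from the study.
capital_investment = 12.0e6          # initial outlay (currency units), assumed
yearly_cash_flow = [2.5e6] * 10      # net yearly profit over a 10-year horizon, assumed
discount_rate = 0.10                 # assumed discount rate

npv = -capital_investment + sum(
    cf / (1.0 + discount_rate) ** (t + 1) for t, cf in enumerate(yearly_cash_flow)
)

# Simple (undiscounted) payback: first year when cumulative profit covers the investment
cumulative, payback_year = 0.0, None
for t, cf in enumerate(yearly_cash_flow, start=1):
    cumulative += cf
    if cumulative >= capital_investment and payback_year is None:
        payback_year = t

print(f"NPV = {npv:,.0f}, payback in year {payback_year}")
```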

Keywords: Flare gas reduction, liquefied petroleum gas, fuel gas, net present value method, sensitivity analysis.

2705 Control and Navigation with Knowledge Bases

Authors: Miloš Šeda, Tomáš Březina

Abstract:

In this paper, we focus on the use of knowledge bases in two different application areas: control of systems with unknown or strongly nonlinear models (i.e., systems hardly controllable by classical methods), and robot motion planning in eight directions. The first application deals with fuzzy logic, and the paper presents approaches for setting and aggregating the rules of a knowledge base. The second concentrates on a case-based reasoning strategy for finding a path in a planar scene with obstacles.
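
For readers unfamiliar with the fuzzification, rule aggregation and defuzzification chain mentioned for the first application, a toy Mamdani-style inference step with two invented rules and centroid defuzzification might look as follows; the membership functions and rules are illustrative, not those of the paper's knowledge base.

```python
# Toy Mamdani-style fuzzy inference sketch: fuzzification, min-implication,
# max-aggregation and centroid defuzzification. Rules and membership
# functions are invented for illustration only.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

error = 0.3                                   # crisp input in [-1, 1]
u = np.linspace(-1.0, 1.0, 201)               # output universe of discourse

# Rule 1: IF error is NEGATIVE THEN output is HIGH
# Rule 2: IF error is POSITIVE THEN output is LOW
w1 = tri(error, -1.0, -0.5, 0.0)              # firing strength of rule 1
w2 = tri(error,  0.0,  0.5, 1.0)              # firing strength of rule 2

clipped1 = np.minimum(w1, tri(u,  0.0, 0.5, 1.0))    # min-implication per rule
clipped2 = np.minimum(w2, tri(u, -1.0, -0.5, 0.0))
aggregated = np.maximum(clipped1, clipped2)          # max-aggregation

output = np.sum(u * aggregated) / np.sum(aggregated)  # centroid defuzzification
print(f"crisp control output: {output:.3f}")
```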

Keywords: fuzzy controller, fuzzification, rule base, inference, defuzzification, genetic algorithm, neural network, case-based reasoning

2704 Influence of Temperature Variations on Calibrated Cameras

Authors: Peter Podbreznik, Božidar Potocnik

Abstract:

Camera parameters change with temperature variations, which directly influences the accuracy of calibrated cameras. The robustness of the calibration methods was measured and their accuracy was tested. The ratio of the error caused by the change in camera parameters to the total error originating in the calibration process was determined. The results indicate that the influence of temperature variations decreases as the distance of the observed objects from the cameras increases.

Keywords: camera calibration, perspective projection matrix, epipolar geometry, temperature variation.

2703 Fuzzy Types Clustering for Microarray Data

Authors: Seo Young Kim, Tai Myong Choi

Abstract:

The main goal of microarray experiments is to quantify the expression of every object on a slide as precisely as possible, with a further goal of clustering the objects. Recently, many studies have discussed clustering issues involving similar patterns of gene expression. This paper presents an application of fuzzy-type methods for clustering DNA microarray data that can be applied to typical comparisons. Clustering and analyses were performed on microarray and simulated data. The results show that fuzzy-possibility c-means clustering substantially improves the findings obtained by others.
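
A bare-bones fuzzy c-means loop, the classical algorithm rather than the fuzzy-possibilistic variant evaluated in the paper, is sketched below on synthetic data to show the alternating membership and centroid updates.

```python
# Classical fuzzy c-means sketch on synthetic data (not the paper's
# fuzzy-possibilistic variant); m is the usual fuzzifier exponent.
import numpy as np

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 20)), rng.normal(3, 1, (50, 20))])  # toy "expression" matrix

def fuzzy_cmeans(X, c=2, m=2.0, n_iter=100):
    n = X.shape[0]
    U = rng.dirichlet(np.ones(c), size=n)              # random initial memberships (rows sum to 1)
    for _ in range(n_iter):
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]   # weighted centroid update
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U = 1.0 / np.sum((d[:, :, None] / d[:, None, :]) ** (2.0 / (m - 1.0)), axis=2)
    return centers, U

centers, U = fuzzy_cmeans(X)
print("hard labels of first 5 samples:", U.argmax(axis=1)[:5])
```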

Keywords: Clustering, microarray data, Fuzzy-type clustering, Validation

2702 Preparation of Tempeh Spore Powder by Freeze Drying

Authors: Jaruwan Chutrtong, Tanakwan Bussabun

Abstract:

The production of tempeh inoculum powder by freeze-drying was studied in comparison with drying at 50°C and sun drying, with the aim of developing an efficient inoculum for tempeh production. Rhizopus oligosporus was grown on PDA slants at 30°C for 3-5 days until spores and mycelium had formed. A spore suspension was prepared with sterilized water and the initial number of spores was counted. The suspension was added to rice flour or soy flour mixed with water (in the ratio 10:7) that had been steamed and sterilized at 121°C for 15 min. After incubation at room temperature for 4 days, the number of spores was counted again. The fully colonized, sporulated dough was then dried at 50°C, sun-dried, or lyophilized, ground to powder, packed in plastic bags and stored at 5°C. To investigate the quality of the inoculums produced by the different methods, tempeh was fermented every 4 weeks over the 24 weeks of the experiment. The results showed that rice flour is not suitable as a raw material for spore powder production: the fungus grew poorly, produced fewer spores and required more time than on soy flour. Among the drying methods, lyophilization took the least time, but its samples were very hard, very dark and more difficult to grind than those from the other methods. Drying at 50°C took longer than lyophilization but allowed the drying time to be set in advance; the dried samples were hard, brown solids that were easier to grind. Sun drying took the longest, and its exact duration could not be determined. When the spore powder was used to ferment tempeh immediately, the product had characteristics similar to those obtained with freshly prepared spores and the tempeh quality was normal. With spore powder stored at low temperature, tempeh made from spores stored for 4, 8 and 12 weeks was still normal, and the production time was close to that of fresh spores. After 16 and 20 weeks of storage the tempeh was still normal, but growth and sporulation took longer than usual (about 6 hours more). After 24 weeks of storage, fungal growth was poor and the resulting tempeh was inferior in color, smell and texture.

Keywords: Freeze drying, preparation, spore powder, tempeh.

2701 High Resolution Methods Based On Rank Revealing Triangular Factorizations

Authors: M. Bouri, S. Bourennane

Abstract:

In this paper, we propose a novel subspace estimation method for high-resolution processing that does not require eigendecomposition: the sample Cross-Spectral Matrix (CSM) is replaced by the upper triangular matrix obtained from its LU factorization. This decreases the computational complexity. The method relies on a recently published result on Rank-Revealing LU (RRLU) factorization. Simulation results demonstrate that the new algorithm outperforms the Householder rank-revealing QR (RRQR) factorization method and MUSIC in low Signal-to-Noise Ratio (SNR) scenarios.

Keywords: Factorization, Localization, Matrix, Signal subspace.

2700 A New Approach to Signal Processing for DC-Electromagnetic Flowmeters

Authors: Michael Schukat

Abstract:

Electromagnetic flowmeters with DC excitation are used for a wide range of fluid measurement tasks, but are rarely found in dosing applications with short measurement cycles because of the achievable accuracy. This paper identifies a number of factors that influence the accuracy of this sensor type when it is used for short-term measurements. Based on these results, a new signal-processing algorithm is described that overcomes the identified problems to some extent. In principle, this new method allows a higher accuracy for electromagnetic flowmeters with DC excitation than traditional methods.
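
Since the keywords point to a Kalman filter, a scalar Kalman filter that tracks a slowly varying flow value from noisy short-cycle samples is sketched below; the random-walk process model, noise variances and signal are placeholders, not the sensor model developed in the paper.

```python
# Scalar Kalman filter sketch for a slowly varying flow estimate from noisy
# samples; process/measurement variances and the signal are placeholders.
import numpy as np

rng = np.random.default_rng(2)
true_flow = 5.0 + 0.01 * np.arange(200)          # slowly drifting flow (arbitrary units)
measurements = true_flow + rng.normal(0.0, 0.5, size=200)

q, r = 1e-4, 0.25                                # assumed process / measurement variance
x_hat, p = measurements[0], 1.0                  # initial state estimate and covariance
estimates = []
for z in measurements:
    # predict with a random-walk model, then update with the new measurement
    p = p + q
    k = p / (p + r)                              # Kalman gain
    x_hat = x_hat + k * (z - x_hat)
    p = (1.0 - k) * p
    estimates.append(x_hat)

print(f"last estimate {estimates[-1]:.2f} vs true {true_flow[-1]:.2f}")
```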

Keywords: Electromagnetic Flowmeter, Kalman Filter, Short Measurement Cycles, Signal Estimation

2699 Automated Transformation of 3D Point Cloud to Building Information Model: Leveraging Algorithmic Modeling for Efficient Reconstruction

Authors: Radul Shishkov, Petar Penchev

Abstract:

The digital era has revolutionized architectural practices, with Building Information Modeling (BIM) emerging as a pivotal tool for architects, engineers, and construction professionals. However, the transition from traditional methods to BIM-centric approaches poses significant challenges, particularly in the context of existing structures. This research presents a technical approach to bridge this gap through the development of algorithms that facilitate the automated transformation of 3D point cloud data into detailed BIM models. The core of this research lies in the application of algorithmic modeling and computational design methods to interpret and reconstruct point cloud data — a collection of data points in space, typically produced by 3D scanners — into comprehensive BIM models. This process involves complex stages of data cleaning, feature extraction, and geometric reconstruction, which are traditionally time-consuming and prone to human error. By automating these stages, our approach significantly enhances the efficiency and accuracy of creating BIM models for existing buildings. The proposed algorithms are designed to identify key architectural elements within point clouds, such as walls, windows, doors, and other structural components, and to translate these elements into their corresponding BIM representations. This includes the integration of parametric modeling techniques to ensure that the generated BIM models are not only geometrically accurate but also embedded with essential architectural and structural information. This research contributes significantly to the field of architectural technology by providing a scalable and efficient solution for the integration of existing structures into the BIM framework. It paves the way for more seamless and integrated workflows in renovation and heritage conservation projects, where the accuracy of existing conditions plays a critical role. The implications of this study extend beyond architectural practices, offering potential benefits in urban planning, facility management, and historical preservation.
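
One building block of such a pipeline, extracting a planar element (a wall or floor candidate) from a point cloud, can be sketched with a plain RANSAC plane fit in NumPy; this is a generic illustration, not the authors' algorithm, and the synthetic cloud below stands in for real scan data.

```python
# Generic RANSAC plane-fitting sketch for extracting a planar element
# (e.g. a wall candidate) from a point cloud; not the authors' pipeline.
import numpy as np

rng = np.random.default_rng(3)
wall = np.c_[rng.uniform(0, 5, 500), np.full(500, 2.0) + rng.normal(0, 0.01, 500), rng.uniform(0, 3, 500)]
clutter = rng.uniform(0, 5, (200, 3))
points = np.vstack([wall, clutter])              # synthetic cloud: one noisy wall + clutter

def ransac_plane(points, n_iter=500, threshold=0.02):
    best_inliers, best_model = np.array([], dtype=int), None
    for _ in range(n_iter):
        sample = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(normal)
        if norm < 1e-9:                          # degenerate (collinear) sample
            continue
        normal /= norm
        d = -normal @ sample[0]
        dist = np.abs(points @ normal + d)       # point-to-plane distances
        inliers = np.flatnonzero(dist < threshold)
        if len(inliers) > len(best_inliers):
            best_inliers, best_model = inliers, (normal, d)
    return best_model, best_inliers

model, inliers = ransac_plane(points)
print(f"plane normal ~ {np.round(model[0], 2)}, {len(inliers)} inlier points")
```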

Keywords: Algorithmic modeling, Building Information Modeling, point cloud, reconstruction.

2698 The Evaluation of Load-Bearing Capacity of the Planar CHS Joint Using Finite Modeling

Authors: Anežka Jurčíková, Miroslav Rosmanit

Abstract:

The subject of this paper is to verify the behavior of a truss-type CHS joint that lies beyond the scope of EN 1993-1-8. This is done using numerical modeling in the program ANSYS and the analytical methods recommended in the CIDECT publication. Recommendations for the numerical modeling of such joints, as well as for the evaluation of their load-bearing capacity, are given. The results from the analytical and numerical models are compared.

Keywords: ANSYS, CHS joints, FEM, Lattice structure

2697 Algorithm for Bleeding Determination Based On Object Recognition and Local Color Features in Capsule Endoscopy

Authors: Yong-Gyu Lee, Jin Hee Park, Youngdae Seo, Gilwon Yoon

Abstract:

Automatic detection of blood in dim or noisy capsule endoscopy images is difficult due to the low signal-to-noise ratio, and analysis of such images may be inaccurate because of external disturbances. We therefore propose detection methods that do not depend on color bands alone. To locate bleeding regions, the identification of object outlines in the frame and the features of their local colors are taken into consideration. The results show that the capability of detecting bleeding was much improved.
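
For contrast, a purely color-band-based baseline for flagging candidate bleeding regions, exactly the kind of approach the paper goes beyond by adding object outlines, can be written with OpenCV as follows; the HSV thresholds and the synthetic frame are illustrative.

```python
# Color-threshold baseline for candidate bleeding regions (the band-dependent
# kind of approach the paper improves upon); HSV thresholds are illustrative.
import cv2
import numpy as np

frame = np.zeros((240, 320, 3), dtype=np.uint8)      # stand-in for a capsule endoscopy frame
cv2.circle(frame, (160, 120), 30, (0, 0, 200), -1)   # synthetic dark-red blob (BGR)

hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
# red hues wrap around 0/180 in OpenCV's H channel
mask = cv2.inRange(hsv, (0, 80, 50), (10, 255, 255)) | cv2.inRange(hsv, (170, 80, 50), (180, 255, 255))

contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
candidates = [c for c in contours if cv2.contourArea(c) > 100]   # drop tiny speckles
print(f"{len(candidates)} candidate bleeding region(s) found")
```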

Keywords: Endoscopy, object recognition, bleeding, image processing, RGB.

2696 Analysis of the Effect of HV Transmission Lines on the Control Room and its Proposed Shielding

Authors: Diako Azizi, Hosein Heydari, Ahmad Gholami

Abstract:

With the rapid growth of telecommunications and electronic equipment and the continuing expansion of power networks, the mutual influence of electromagnetic waves has become a hot topic. This article addresses this issue and presents appropriate mechanisms for electromagnetic compatibility (EMC). First, the impact of high-voltage lines on the surrounding environment, especially on the control room, is investigated; then, to reduce electromagnetic radiation, various shielding methods are proposed and their shielding effectiveness is compared. The simulations were carried out using the finite element method (FEM).

Keywords: Electrical field, EMC, field distribution, finite element method

2695 The Study on Migration Strategy of Legacy System

Authors: Chao Qi, Fuyang Peng, Bo Deng, Xiaoyan Su

Abstract:

In the process of upgrading enterprise information systems, whether the new systems succeed and whether their development is efficient depend on how the legacy systems are handled and utilized. We propose an evaluation framework that comprehensively describes the capability of legacy information systems in five aspects, and from it derive a practical legacy system evaluation method. Based on the evaluation result, we put forward four kinds of migration strategy: elimination, maintenance, modification, and encapsulation. These methods and strategies play important roles in practice.

Keywords: Legacy Systems, Evaluation Method, Migration Strategy.

2694 Qualitative and Quantitative Case Study Research Method on Social Science: Accounting Perspective

Authors: Bubaker F. Shareia

Abstract:

The main aim of this paper is to set the parameters within which the study is to be conducted, specifically justifying the use of qualitative research, informed by theory. This paper argues that the social world is subjective in nature and may be accessed through the interpretive approach provided by the people involved in the context of the study. The paper defines and distinguishes between qualitative and quantitative research methodologies, explores Burrell and Morgan's framework for social research, and presents the study's adopted methodology and methods, with the rationale for these choices.

Keywords: Accounting, methodologies, qualitative, quantitative research.

2693 Using Textual Pre-Processing and Text Mining to Create Semantic Links

Authors: Ricardo Avila, Gabriel Lopes, Vania Vidal, Jose Macedo

Abstract:

This article offers an approach to the automatic discovery of semantic concepts and links in the domain of Oil Exploration and Production (E&P). Machine learning methods combined with textual pre-processing techniques were used to detect local patterns in texts and, thus, to generate new concepts and new semantic links. Even when using the more specific vocabularies of the oil domain, our approach achieved satisfactory results, suggesting that the proposal can be applied to other domains and languages, requiring only minor adjustments.
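
As a minimal illustration of turning local textual patterns into candidate semantic links, the sketch below compares TF-IDF vectors of term contexts with cosine similarity using scikit-learn. The corpus, terms and threshold are invented, and the approach described above is considerably richer than this.

```python
# Minimal sketch: propose candidate semantic links between domain terms by
# comparing TF-IDF vectors of their textual contexts. Corpus, terms and the
# similarity threshold are invented; the paper's approach is richer than this.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# toy "context documents", one per candidate E&P term
contexts = {
    "drilling rig": "offshore platform used for drilling exploratory wells",
    "wellbore": "hole drilled into the reservoir to extract hydrocarbons",
    "seismic survey": "acoustic imaging of subsurface geology before drilling",
}
terms = list(contexts)
tfidf = TfidfVectorizer().fit_transform([contexts[t] for t in terms])
sim = cosine_similarity(tfidf)

threshold = 0.05
for i in range(len(terms)):
    for j in range(i + 1, len(terms)):
        if sim[i, j] > threshold:
            print(f"candidate link: {terms[i]} <-> {terms[j]} (sim={sim[i, j]:.2f})")
```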

Keywords: Semantic links, data mining, linked data, SKOS.

2692 Numerical Solution of Hammerstein Integral Equations by Using Quasi-Interpolation

Authors: M. Zarebnia, S. Khani

Abstract:

In this paper, a numerical method based on quasi-interpolation for solving nonlinear Fredholm integral equations of Hammerstein type is first presented. Then, the solution of Hammerstein integral equations is approximated by Nystrom's method. Finally, the two methods are compared on some numerical examples.
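
As context for the Nystrom approach, a Hammerstein equation of the form u(x) = f(x) + the integral over [0, 1] of k(x,t) g(t, u(t)) dt can be discretized on quadrature nodes and solved by fixed-point iteration. The kernel, nonlinearity and right-hand side in the sketch below are arbitrary illustrative choices, not the examples from the paper.

```python
# Nystrom-type sketch for a Hammerstein equation
#   u(x) = f(x) + integral_0^1 k(x,t) g(t, u(t)) dt
# using the trapezoidal rule and fixed-point iteration. The kernel k,
# nonlinearity g and right-hand side f are arbitrary illustrative choices.
import numpy as np

n = 101
x = np.linspace(0.0, 1.0, n)
w = np.full(n, 1.0 / (n - 1))
w[0] = w[-1] = 0.5 / (n - 1)              # trapezoid weights

k = 0.5 * np.outer(x, x)                  # kernel k(x,t) = 0.5 * x * t
g = lambda t, u: u ** 2                   # Hammerstein nonlinearity g(t, u)
f = x.copy()                              # right-hand side f(x) = x

u = f.copy()
for _ in range(100):                      # fixed-point (Picard) iteration on the nodes
    u_new = f + k @ (w * g(x, u))
    if np.max(np.abs(u_new - u)) < 1e-12:
        break
    u = u_new

print("u(1) =", u[-1])                    # analytic fixed point gives u(1) close to 1.172
```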

Keywords: Hammerstein integral equations, quasi-interpolation, Nystrom’s method.

2691 Linear Elasticity Problems Solved by Using the Fictitious Domain Method and Total-FETI Domain Decomposition

Authors: Lukas Mocek, Alexandros Markopoulos

Abstract:

The main goal of this paper is to show how elliptic boundary value problems arising in 2D linear elasticity can be solved numerically using the fictitious domain method (FDM) and the Total-FETI domain decomposition method. We briefly mention the theoretical background of these methods and demonstrate their performance on a benchmark.

Keywords: Linear elasticity, fictitious domain method, Total-FETI, domain decomposition, saddle-point system.

2690 Evaluating Complexity – Ethical Challenges in Computational Design Processes

Authors: J. Partanen

Abstract:

Complexity, as a theoretical background, has made it easier to understand and explain the features and dynamic behavior of various complex systems. As this common theoretical background has confirmed, borrowing terminology for design from the natural sciences has helped to control and understand urban complexity. Phenomena like self-organization, evolution and adaptation are appropriate for describing the formerly inaccessible characteristics of the complex environment in unpredictable bottom-up systems. Increased computing capacity has been a key element in capturing the chaotic nature of these systems. A paradigm shift in urban planning and architectural design has forced us to give up the illusion of total control over the urban environment, and consequently to seek novel methods for steering its development. New methods using dynamic modeling have offered a real option for a more thorough understanding of complexity and urban processes. At best, new approaches may renew design processes so that we get a better grip on the complex world via more flexible processes, support urban environmental diversity, and respond to our needs beyond basic welfare by liberating ourselves from standardized minimalism. A complex system and its features are, as such, beyond human ethics. Self-organization or evolution is neither good nor bad; their mechanisms are by nature devoid of reason. They are common in urban dynamics as well as in natural processes. They are features of a complex system, and they cannot be prevented; yet their dynamics can be studied and supported. The paradigm of complexity and the new design approaches have been criticized for a lack of humanity and morality, but the ethical implications of scientific or computational design processes have not been much discussed. It is important to distinguish the ethics of the theory and tools themselves from the ethics of computer-aided processes based on ethical decisions. Urban planning and architecture cannot be based on the survival of the fittest; however, the natural dynamics of the system cannot be impeded on the grounds of being "non-human". In this paper, the ethical challenges of using dynamic models are contemplated in light of a few examples of new architecture, dynamic urban models, and the literature. It is suggested that ethical challenges in computational design processes could be reframed under the concepts of responsibility and transparency.

Keywords: urban planning, architecture, dynamic modeling, ethics, complexity theory.

2689 One Dimensional Object Segmentation and Statistical Features of an Image for Texture Image Recognition System

Authors: Nang Thwe Thwe Oo

Abstract:

Traditional object segmentation methods are time-consuming and computationally difficult. In this paper, one-dimensional object detection along secant lines is applied. Statistical features of texture images are computed for the recognition process. Example matrices of these features and formulae for the calculation of similarities between two feature patterns are presented, and experiments are carried out using these features.
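
To make the idea of statistical feature matrices and pattern similarity concrete, the toy sketch below builds a gray-level occurrence (frequency) matrix along horizontal secant lines and compares two such matrices with a normalized correlation; the construction is invented for illustration and is much simpler than the features defined in the paper.

```python
# Toy sketch: gray-level occurrence matrices gathered along horizontal secant
# lines and a normalized-correlation similarity between two such matrices.
# The construction is invented and simpler than the paper's features.
import numpy as np

def occurrence_matrix(img: np.ndarray, levels: int = 8) -> np.ndarray:
    """Count co-occurrences of quantized gray levels of horizontally adjacent pixels."""
    q = (img.astype(float) / 256.0 * levels).astype(int).clip(0, levels - 1)
    m = np.zeros((levels, levels))
    for row in q:                              # each row acts as one secant line
        for a, b in zip(row[:-1], row[1:]):
            m[a, b] += 1
    return m / m.sum()

def similarity(m1: np.ndarray, m2: np.ndarray) -> float:
    """Normalized correlation between two flattened feature matrices."""
    v1, v2 = m1.ravel() - m1.mean(), m2.ravel() - m2.mean()
    return float(v1 @ v2 / (np.linalg.norm(v1) * np.linalg.norm(v2) + 1e-12))

rng = np.random.default_rng(4)
tex_a = rng.integers(0, 128, (64, 64))         # two synthetic "textures"
tex_b = rng.integers(128, 256, (64, 64))
print("self-similarity :", round(similarity(occurrence_matrix(tex_a), occurrence_matrix(tex_a)), 3))
print("cross-similarity:", round(similarity(occurrence_matrix(tex_a), occurrence_matrix(tex_b)), 3))
```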

Keywords: 1-D object segmentation, secant lines, object occurrence (frequency) matrix, contiguity matrix, statistical features.

2688 Comparison of the DC/DC-Converters for Fuel Cell Applications

Authors: Oleksandr Krykunov

Abstract:

The source voltage of a high-power fuel cell shows strong load dependence at comparatively low voltage levels. In order to provide a voltage of 750 V on the DC link for feeding electrical energy into the mains via a three-phase inverter, a step-up converter with a large step-up ratio is required. The output voltage of this DC/DC converter must remain stable during variations of the load current and the fuel cell voltage. This paper presents the methods and results of calculating the efficiency and the implementation cost of DC/DC converter circuits that meet these requirements.
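
For orientation, the ideal continuous-conduction boost relations behind such a step-up stage are D = 1 - Vin/Vout and efficiency = Pout / (Pout + Plosses); the back-of-the-envelope sketch below uses assumed fuel-cell figures and a crude loss budget, not the converter data analyzed in the paper.

```python
# Back-of-the-envelope boost-converter figures for a fuel-cell front end.
# All numbers are assumed for illustration and are not taken from the paper.
v_fc, v_dc_link = 60.0, 750.0        # assumed fuel-cell voltage and required DC-link voltage
p_out = 50e3                         # assumed output power, W

duty = 1.0 - v_fc / v_dc_link        # ideal continuous-conduction boost: D = 1 - Vin/Vout
i_in = p_out / v_fc                  # input current drawn from the fuel cell (lossless estimate)

# crude loss budget: conduction + switching losses, assumed values
r_on, f_sw, e_sw = 5e-3, 20e3, 2e-3  # switch on-resistance (ohm), switching freq (Hz), energy per cycle (J)
p_cond = i_in ** 2 * r_on * duty     # conduction loss while the switch conducts
p_sw = f_sw * e_sw                   # switching loss
efficiency = p_out / (p_out + p_cond + p_sw)

print(f"duty cycle D = {duty:.3f}, input current = {i_in:.0f} A, efficiency = {efficiency:.3f}")
```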

Keywords: DC/DC-converter, calculation, efficiency, fuel cell.

2687 Signature Recognition Using Conjugate Gradient Neural Networks

Authors: Jamal Fathi Abu Hasna

Abstract:

There are two common methodologies for verifying signatures: the functional approach and the parametric approach. This paper presents a new approach to dynamic handwritten signature verification (HSV) using a neural network, with verification performed by a Conjugate Gradient Neural Network (NN). It is yet another avenue to HSV that is found to produce excellent results when compared with other dynamic methods. Experimental results show that the system is insensitive to the order of the base classifiers and achieves a high verification ratio.

Keywords: Signature Verification, MATLAB Software, Conjugate Gradient, Segmentation, Skilled Forgery, and Genuine.

2686 International E-Learning for Assuring Ergonomic Working Conditions of Orthopaedic Surgeons: First Research Outcomes from Train4OrthoMIS

Authors: J. Bartnicka, J. A. Piedrabuena, R. Portilla, L. Moyano-Cuevas, J. B. Pagador, P. Augat, J. Tokarczyk, F. M. Sánchez Margallo

Abstract:

Orthopaedic surgeries are characterized by a high degree of complexity. This is reflected by four main groups of resources: 1) the surgical team, which consists of people with different competencies, educational backgrounds and positions; 2) information and knowledge about the medical and technical aspects of the surgery; 3) medical equipment, including surgical tools and materials; 4) the spatial infrastructure, which is important from the point of view of the operating room layout. All these components must be integrated into a homogeneous whole to achieve an efficient and ergonomically correct surgical workflow. Against this background, the concept of an international project called "Online Vocational Training course on ergonomics for orthopaedic Minimally Invasive" (Train4OrthoMIS) was formulated, whose aim is to develop an e-learning tool available in 4 languages (English, Spanish, Polish and German). This article presents the first project research outcomes, focused on three aspects: 1) the ergonomic needs of surgeons who work in hospitals in different European countries, 2) the concept and structure of the e-learning course, and 3) the definition of tools and methods for knowledge assessment adjusted to users' expectations. The methodology was based on expert panels and two types of surveys: 1) on training needs, and 2) on evaluation and self-assessment preferences. The major findings of the study allowed the subjects of four training modules and learning sessions to be described. According to the respondents' opinions, the most expected test methods are single-choice tests, followed by "True or False" and "Link elements" quizzes. The first project outcomes confirmed the necessity of creating a universal training tool for orthopaedic surgeons regardless of the country in which they work. Because of the limited time that surgeons have, the e-learning course should be closely adjusted to their expectations in order to be useful.

Keywords: International e-learning, ergonomics, orthopaedic surgery, Train4OrthoMIS.

2685 An Innovative Fuzzy Decision Making Based Genetic Algorithm

Authors: M. A. Sharbafi, M. Shakiba Herfeh, Caro Lucas, A. Mohammadi Nejad

Abstract:

Several researchers have proposed methods for combining Genetic Algorithms (GA) and fuzzy logic (using a GA to obtain fuzzy rules, and applying fuzzy logic to the optimization of a GA). In this paper, we suggest a new method in which fuzzy decision making is used to improve the performance of the genetic algorithm. In the suggested method, we determine the alleles that enhance the fitness of chromosomes and try to insert them into the next generation. The algorithm thus introduces an innovative vaccination step into the reproduction process of the genetic algorithm, while considering the trade-off between exploration and exploitation.

Keywords: Genetic Algorithm, Fuzzy Decision Making.

2684 High Dynamic Range Resampling for Software Radio

Authors: Arthur David Snider, Laiq Azam

Abstract:

The classic problem of recovering arbitrary values of a band-limited signal from its samples has an added complication in software radio applications; namely, the resampling calculations inevitably fold aliases of the analog signal back into the original bandwidth. The phenomenon is quantified by the spurious-free dynamic range (SFDR). We demonstrate how a novel application of the Remez (Parks-McClellan) algorithm permits optimal signal recovery and SFDR, far surpassing state-of-the-art resamplers.
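
SciPy exposes the Parks-McClellan exchange algorithm as scipy.signal.remez; the sketch below designs a generic equiripple low-pass filter of the kind used for interpolation in resampling. The band edges and tap count are arbitrary choices, and this is not the optimized design reported in the paper.

```python
# Equiripple low-pass design with the Parks-McClellan (Remez) algorithm for
# use as a resampling/interpolation filter. Band edges and tap count are
# arbitrary illustrative choices, not the paper's optimized design.
import numpy as np
from scipy.signal import remez, freqz

fs = 1.0                                        # normalized sampling rate
taps = remez(numtaps=101,
             bands=[0.0, 0.20, 0.26, 0.5 * fs], # passband up to 0.20, stopband from 0.26
             desired=[1.0, 0.0],
             fs=fs)

w, h = freqz(taps, worN=2048, fs=fs)
stop = np.abs(h[w >= 0.26])
print(f"worst-case stopband ripple: {20 * np.log10(stop.max()):.1f} dB")
```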

Keywords: Sampling methods, Signal sampling, Digital radio, Digital-analog conversion.

2683 Concentrated Animal Feeding Operations and Planning in the United States: Evidences from North Carolina

Authors: Asmaa Benbaba

Abstract:

This paper aims to reconsider the relationship between concentrated animal feeding operations (CAFOs) and planning. It stresses the necessity of a methodological revolution in order to increase the chances for dialogue between different actors and various planning agencies and to create possibilities for managing conflicts. The explored case of North Carolina shows limitations in the environmental agencies' actions and methods. It also calls for a more integrated approach among agencies, including local agencies.

Keywords: Concentrated animal feeding operations (CAFOs), North Carolina, planning, United States.

2682 Improved Simultaneous Performance in the Time Domain and in the Frequency Domain

Authors: Azeddine Ghodbane, David Bensoussan, Maher Hammami

Abstract:

In this study, we introduce an alternative adaptive architecture that enhances both time-domain and frequency-domain performance, helping to mitigate the effects of disturbances at the plant input as well as external disturbances affecting the output. To facilitate superior performance in both domains, we have developed user-friendly interactive design methods using the GeoGebra platform.

Keywords: Control theory, decentralized control, sensitivity theory, input-output stability theory, robust multivariable feedback control design.

2681 Electromagnetic Tuned Mass Damper Approach for Regenerative Suspension

Authors: S. Kopylov, C. Z. Bo

Abstract:

This study explores the possibility of recovering energy through the suppression of vibrations. The article describes the design of an electromagnetic dynamic damper. The magnetic part of the device performs the function of a tuned mass damper, thereby providing both energy regeneration and damping to the protected mass. Based on the theory of the tuned mass damper, the equations of the mathematical model were obtained. Then, for the given properties of the current system, the amplitude-frequency response was investigated, and the main ideas and methods for further research were defined.
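
For reference, the classical two-degree-of-freedom tuned-mass-damper model that underlies such a device (primary mass m1 with spring k1 and damper c1, absorber mass m2 attached through k2 and c2; the electromagnetic coupling studied in the paper is omitted) is governed by the following equations of motion.

```latex
% Classical 2-DOF tuned-mass-damper model (electromagnetic coupling omitted)
\begin{aligned}
m_1 \ddot{x}_1 + c_1 \dot{x}_1 + k_1 x_1 + c_2(\dot{x}_1 - \dot{x}_2) + k_2(x_1 - x_2) &= F(t) \\
m_2 \ddot{x}_2 + c_2(\dot{x}_2 - \dot{x}_1) + k_2(x_2 - x_1) &= 0
\end{aligned}
```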

Keywords: Electromagnetic damper, oscillations with two degrees of freedom, regeneration systems, tuned mass damper.

2680 A Risk Assessment Tool for the Contamination of Aflatoxins on Dried Figs Based on Machine Learning Algorithms

Authors: Kottaridi Klimentia, Demopoulos Vasilis, Sidiropoulos Anastasios, Ihara Diego, Nikolaidis Vasileios, Antonopoulos Dimitrios

Abstract:

Aflatoxins are highly poisonous and carcinogenic compounds produced by species of the genus Aspergillus spp. that can infect a variety of agricultural foods, including dried figs. Biological and environmental factors, such as the population, pathogenicity and aflatoxinogenic capacity of the strains, and the topography, soil and climate parameters of the fig orchards, are believed to have a strong effect on aflatoxin levels. Existing methods for aflatoxin detection and measurement, such as high-performance liquid chromatography (HPLC) and enzyme-linked immunosorbent assay (ELISA), can provide accurate results, but the procedures are usually time-consuming, sample-destructive and expensive. Predicting aflatoxin levels prior to crop harvest is useful for minimizing the health and financial impact of a contaminated crop. Consequently, there is interest in developing a tool that predicts aflatoxin levels based on topography and soil analysis data of fig orchards. This paper describes the development of a risk assessment tool for the contamination of aflatoxin on dried figs, based on the location and altitude of the fig orchards, the population of the fungus Aspergillus spp. in the soil, and soil parameters such as pH, saturation percentage (SP), electrical conductivity (EC), organic matter, particle size analysis (sand, silt, clay), concentration of the exchangeable cations (Ca, Mg, K, Na), extractable P and trace elements (B, Fe, Mn, Zn and Cu), by employing machine learning methods. In particular, our proposed method integrates three machine learning techniques, i.e., dimensionality reduction of the original dataset (Principal Component Analysis), metric learning (Mahalanobis Metric for Clustering) and the K-nearest Neighbors learning algorithm (KNN), into an enhanced model, with mean performance equal to 85% in terms of the Pearson Correlation Coefficient (PCC) between observed and predicted values.
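
A stripped-down version of such a pipeline, PCA for dimensionality reduction followed by a k-nearest-neighbors regressor scored with the Pearson correlation coefficient, can be sketched with scikit-learn as below. The Mahalanobis metric-learning stage is omitted and the data are synthetic placeholders, so this shows only the structure, not the reported 85% PCC.

```python
# Structural sketch of the prediction pipeline: PCA -> KNN regression,
# scored with the Pearson correlation coefficient. The Mahalanobis metric
# learning (MMC) stage is omitted and the data are synthetic placeholders.
import numpy as np
from scipy.stats import pearsonr
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(5)
X = rng.normal(size=(120, 18))                      # stand-in for soil/topography features
y = X[:, :3].sum(axis=1) + rng.normal(0, 0.3, 120)  # stand-in for aflatoxin level

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = make_pipeline(StandardScaler(), PCA(n_components=5), KNeighborsRegressor(n_neighbors=5))
model.fit(X_tr, y_tr)

pcc, _ = pearsonr(y_te, model.predict(X_te))
print(f"Pearson correlation on held-out data: {pcc:.2f}")
```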

Keywords: aflatoxins, Aspergillus spp., dried figs, k-nearest neighbors, machine learning, prediction
