Search results for: fast Fourier algorithms
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4619

3569 Effect of Electromagnetic Fields on Protein Extraction from Shrimp By-Products for Electrospinning Process

Authors: Guido Trautmann-Sáez, Mario Pérez-Won, Vilbett Briones, María José Bugueño, Gipsy Tabilo-Munizaga, Luis Gonzáles-Cavieres

Abstract:

Shrimp by-products are a valuable source of protein; however, traditional protein extraction methods have limited efficiency. In this work, protein was extracted from shrimp (Pleuroncodes monodon) industrial by-products by a chemical method (NaOH and HCl, 2 M) assisted with ohmic heating (OH), microwave (MW), and pulsed electric field (PEF) in a continuous-flow system (5 ml/s). The extracts were characterized by protein determination, differential scanning calorimetry (DSC), and Fourier-transform infrared (FTIR) spectroscopy. Results indicate improvements in protein extraction efficiency of 19.25% (PEF), 3.65% (OH), and 28.19% (MW). The most efficient method was selected for the electrospinning process and fiber production.

Keywords: electrospinning process, emerging technology, protein extraction, shrimp by-products

Procedia PDF Downloads 93
3568 Methods and Algorithms of Ensuring Data Privacy in AI-Based Healthcare Systems and Technologies

Authors: Omar Farshad Jeelani, Makaire Njie, Viktoriia M. Korzhuk

Abstract:

The application of AI-powered algorithms in healthcare continues to flourish. In particular, access to healthcare information, including patient health history, diagnostic data, and Personally Identifiable Information (PII), is paramount to delivering efficient patient outcomes. However, as the exchange of healthcare information between patients and providers through AI-powered solutions increases, protecting a person’s information and privacy becomes even more important. Arguably, the increased adoption of healthcare AI has concentrated attention on the security risks to healthcare data and the corresponding protection measures, leading to escalated analysis and enforcement. Since these challenges arise from the use of AI-based healthcare solutions to manage healthcare data, AI-based data protection measures are used to resolve the underlying problems. Consequently, this project proposes AI-powered safeguards and policies/laws to protect the privacy of healthcare data. The project presents the best-in-class techniques used to preserve data privacy in AI-powered healthcare applications. Popular privacy-protecting methods such as federated learning, cryptographic techniques, differential privacy, and hybrid methods are discussed, together with potential cyber threats, data security concerns, and prospects. The project also reviews relevant data security acts/laws that govern the collection, storage, and processing of healthcare data to guarantee that owners’ privacy is preserved. Finally, it discusses gaps and uncertainties in healthcare AI data collection procedures and identifies potential correction/mitigation measures.
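As a minimal illustration of one of the methods surveyed, the Laplace mechanism of differential privacy can be sketched as follows. The patient records, predicate, and epsilon value below are invented for illustration; this is not code from the project.

```python
import math
import random

def dp_count(records, predicate, epsilon):
    """Release a counting query under epsilon-differential privacy using
    the Laplace mechanism; the sensitivity of a count query is 1."""
    true_count = sum(1 for r in records if predicate(r))
    scale = 1.0 / epsilon  # noise scale = sensitivity / epsilon
    # Sample Laplace(0, scale) noise by inverse-CDF from a uniform draw.
    u = random.random() - 0.5
    noise = -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return true_count + noise

random.seed(0)
# Hypothetical patient records; only the noisy aggregate is released.
patients = [{"age": a, "diabetic": a % 3 == 0} for a in range(40, 90)]
noisy = dp_count(patients, lambda p: p["diabetic"], epsilon=1.0)
```

Federated learning and cryptographic techniques operate at different layers (model training and transport, respectively); the mechanism above addresses only query release.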

Keywords: data privacy, artificial intelligence (AI), healthcare AI, data sharing, healthcare organizations (HCOs)

Procedia PDF Downloads 96
3567 Discrete PID and Discrete State Feedback Control of a Brushed DC Motor

Authors: I. Valdez, J. Perdomo, M. Colindres, N. Castro

Abstract:

Today, digital servo systems are extensively used in industrial manufacturing processes, robotic applications, vehicles and other areas. In such control systems, control action is provided by digital controllers with different compensation algorithms, which are designed to meet specific requirements for a given application. Due to the constant search for optimization in industrial processes, it is of interest to design digital controllers that offer ease of realization, improved computational efficiency, affordable return rates, and ease of tuning that ultimately improve the performance of the controlled actuators. There is a vast range of options of compensation algorithms that could be used, although in the industry, most controllers used are based on a PID structure. This research article compares different types of digital compensators implemented in a servo system for DC motor position control. PID compensation is evaluated on its two most common architectures: PID position form (1 DOF), and PID speed form (2 DOF). State feedback algorithms are also evaluated, testing two modern control theory techniques: discrete state observer for non-measurable variables tracking, and a linear quadratic method which allows a compromise between the theoretical optimal control and the realization that most closely matches it. The compared control systems’ performance is evaluated through simulations in the Simulink platform, in which it is attempted to model accurately each of the system’s hardware components. The criteria by which the control systems are compared are reference tracking and disturbance rejection. 
In this investigation, accurate tracking of the reference signal is considered particularly important for a position control system because of the frequency and suddenness with which the control signal can change in position control applications, while disturbance rejection is considered essential because the torque applied to the motor shaft by sudden load changes can be modeled as a disturbance that must be rejected to preserve reference tracking. Results show that 2 DOF PID controllers exhibit high performance in terms of the benchmarks mentioned, as long as they are properly tuned. As for controllers based on state feedback, given the advantages the state-space representation provides for modelling MIMO systems, such controllers are expected to be easier to tune for disturbance rejection, assuming the designer is experienced. An in-depth multi-dimensional analysis of preliminary research results indicates that the state feedback control method is more satisfactory, but the PID control method is easier to implement in most control applications.
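For reference, the 1 DOF position-form PID evaluated in the paper can be sketched in a few lines. The gains, sample time, and toy integrator plant below are illustrative and are not the paper's motor model.

```python
class DiscretePID:
    """Position-form discrete PID:
    u[k] = Kp*e[k] + Ki*T*sum(e) + Kd*(e[k] - e[k-1])/T."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt              # rectangular integration
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return (self.kp * error
                + self.ki * self.integral
                + self.kd * derivative)

# Toy position plant: the shaft angle integrates the control signal.
pid = DiscretePID(kp=4.0, ki=1.0, kd=0.2, dt=0.01)
theta, setpoint = 0.0, 1.0
for _ in range(2000):                  # 20 s of simulated time
    u = pid.update(setpoint, theta)
    theta += 0.01 * u                  # crude integrator plant
```

The integral term drives the steady-state error to zero, which is why the loop settles at the setpoint even with this crude plant.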

Keywords: control, DC motor, discrete PID, discrete state feedback

Procedia PDF Downloads 268
3566 Building User Behavioral Models by Processing Web Logs and Clustering Mechanisms

Authors: Madhuka G. P. D. Udantha, Gihan V. Dias, Surangika Ranathunga

Abstract:

Modern websites host rich applications, but few methodologies exist for analyzing user navigation through a website and determining whether the site is being put to its intended use. Web logs are typically consulted only after a major attack or malfunction, yet they record a wealth of information about user activity in the system. Analyzing web logs has become a challenge due to the huge log volume; finding interesting patterns is difficult because of the size and distribution of the logs and the importance of minor details in each entry. Web logs therefore contain very important data about users and the site that is not being put to good use. Retrieving this information gives an idea of what users need, allows users to be grouped according to their various needs, and supports improving the site to make it effective and efficient. The model we built can also detect attacks, system malfunctions, and anomalies. Logs become more complex as traffic volume and the size and complexity of the website grow. The solution is fully automated and uses unsupervised techniques; expert knowledge is used only in validation. In our approach, the logs are first cleaned and normalized to a common platform with a standard format and structure. After the cleaning module, a web session builder is executed. It outputs two files: a Web Sessions file and an Indexed URLs file. The Indexed URLs file contains the list of URLs accessed and their indices; the Web Sessions file lists the URL indices of each web session. Then the DBSCAN and EM algorithms are applied iteratively and recursively to obtain the best clustering of the web sessions. Using homogeneity, completeness, V-measure, intra- and inter-cluster distance, and the silhouette coefficient as parameters, the algorithms self-evaluate in order to choose better parameter values for subsequent runs. If a cluster is found to be too large, micro-clustering is applied.
Using the Cluster Signature Module, each cluster is annotated with a unique signature, called a fingerprint. In this module, each cluster is fed to an Associative Rule Learning Module; if it outputs confidence and support of 1 for an access sequence, that sequence is a potential signature for the cluster. The occurrences of the access sequence are then checked in the other clusters; if it is found to be unique to the cluster under consideration, the cluster is annotated with that signature. These signatures are used for anomaly detection, prevention of cyber attacks, real-time dashboards that visualize users accessing web pages, prediction of user actions, and various other applications for finance, university, and news and media websites, among others.
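The session-clustering step can be illustrated with a minimal DBSCAN over toy session features. The feature choice (pages visited, session minutes), eps, and min_pts below are hypothetical; the paper additionally runs EM and self-tunes these parameters.

```python
import math

def dbscan(points, eps, min_pts):
    """Minimal DBSCAN: returns one label per point (-1 = noise)."""
    def neighbors(i):
        return [j for j, q in enumerate(points)
                if math.dist(points[i], q) <= eps]

    labels = [None] * len(points)
    cluster = -1
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        nbrs = neighbors(i)
        if len(nbrs) < min_pts:
            labels[i] = -1              # provisionally noise
            continue
        cluster += 1                    # i is a core point: new cluster
        labels[i] = cluster
        seeds = list(nbrs)
        while seeds:
            j = seeds.pop()
            if labels[j] == -1:
                labels[j] = cluster     # noise reachable from a core -> border
            if labels[j] is not None:
                continue
            labels[j] = cluster
            jn = neighbors(j)
            if len(jn) >= min_pts:      # only core points expand the cluster
                seeds.extend(jn)
    return labels

# Hypothetical session features: (pages visited, session length in minutes).
sessions = [(2, 1), (3, 2), (2, 2), (30, 40), (31, 41), (29, 39), (100, 5)]
labels = dbscan(sessions, eps=3.0, min_pts=2)
```

Here the two dense groups of sessions form clusters and the isolated outlier session is labeled noise, which is exactly the behavior the anomaly-detection use case relies on.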

Keywords: anomaly detection, clustering, pattern recognition, web sessions

Procedia PDF Downloads 288
3565 Heuristics for Optimizing Power Consumption in the Smart Grid

Authors: Zaid Jamal Saeed Almahmoud

Abstract:

Our increasing reliance on electricity, combined with inefficient consumption trends, has produced several economic and environmental threats: wasting billions of dollars, draining limited resources, and amplifying the impact of climate change. As a solution, the smart grid is emerging as the future power grid, with smart techniques to optimize power consumption and electricity generation. Minimizing the peak power consumption under a fixed delay requirement is a significant problem in the smart grid, and matching demand to supply is a key requirement for the success of the future electricity grid. In this work, we consider the problem of minimizing the peak demand under appliance constraints by scheduling power jobs with uniform release dates and deadlines. As the problem is known to be NP-hard, we propose two versions of a heuristic algorithm for solving it. Our theoretical analysis and experimental results show that the proposed heuristics outperform existing methods by providing a better approximation to the optimal solution. In addition, we consider dynamic pricing methods to minimize the peak load and match demand to supply in the smart grid. Our contribution is the proposal of generic as well as customized pricing heuristics to minimize the peak demand and match demand with supply. We also propose optimal pricing algorithms that can be used when the maximum deadline period of the power jobs is relatively small. Finally, we provide theoretical analysis and conduct several experiments to evaluate the performance of the proposed algorithms.
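A greedy earliest-deadline-first placement gives the flavor of peak-shaving heuristics of this kind. This is a sketch only, not the paper's algorithm, and the job list is invented; each job is a (power, deadline) pair that must run in one unit slot before its deadline.

```python
def schedule_jobs(jobs, horizon):
    """Greedy peak-minimizing scheduler (illustrative sketch).
    Jobs are taken earliest-deadline-first and each is placed in the
    feasible slot with the lowest accumulated load; deadlines are >= 1."""
    load = [0.0] * horizon
    plan = []
    for power, deadline in sorted(jobs, key=lambda j: j[1]):
        slot = min(range(min(deadline, horizon)), key=lambda t: load[t])
        load[slot] += power
        plan.append((power, slot))
    return plan, max(load)

# (power_kW, deadline_slot) pairs, all released at time 0.
jobs = [(2.0, 1), (1.0, 2), (1.0, 2), (3.0, 4), (1.0, 4)]
plan, peak = schedule_jobs(jobs, horizon=4)
```

Scheduling all five jobs into the first slot would give a peak of 8 kW; spreading them across feasible slots brings the peak down to the largest single job.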

Keywords: heuristics, optimization, smart grid, peak demand, power supply

Procedia PDF Downloads 89
3564 Prediction Modeling of Alzheimer’s Disease and Its Prodromal Stages from Multimodal Data with Missing Values

Authors: M. Aghili, S. Tabarestani, C. Freytes, M. Shojaie, M. Cabrerizo, A. Barreto, N. Rishe, R. E. Curiel, D. Loewenstein, R. Duara, M. Adjouadi

Abstract:

A major challenge in medical studies, especially longitudinal ones, is the problem of missing measurements, which hinders the effective application of many machine learning algorithms. Furthermore, recent Alzheimer's disease studies have focused on delineating Early Mild Cognitive Impairment (EMCI) and Late Mild Cognitive Impairment (LMCI) from cognitively normal controls (CN), which is essential for developing effective early treatment methods. To address these challenges, this paper explores the potential of the eXtreme Gradient Boosting (XGBoost) algorithm for handling missing values in multiclass classification. We seek a generalized classification scheme where all prodromal stages of the disease are considered simultaneously in the classification and decision-making processes. Given the large number of subjects (1631) included in this study, and in the presence of almost 28% missing values, we investigated the performance of XGBoost on the classification of the four classes AD, CN, EMCI, and LMCI. Using a 10-fold cross-validation technique, XGBoost is shown to outperform other state-of-the-art classification algorithms by 3% in terms of accuracy and F-score. Our model achieved an accuracy of 80.52%, a precision of 80.62%, and a recall of 80.51%, supporting this more natural and promising multiclass classification.
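XGBoost's tolerance of missing values comes from its sparsity-aware splits: each tree split learns a default branch to which missing entries are routed. A toy, single-stump version of that idea (with made-up data, not the study's ADNI features) is:

```python
def stump_with_default_direction(xs, ys, threshold):
    """Sketch of XGBoost's sparsity-aware split idea: at a split, missing
    values (None) are sent to whichever branch lowers squared error more."""
    def sse(vals):
        if not vals:
            return 0.0
        m = sum(vals) / len(vals)
        return sum((v - m) ** 2 for v in vals)

    best = None
    for default_left in (True, False):
        left, right = [], []
        for x, y in zip(xs, ys):
            go_left = (x is None and default_left) or \
                      (x is not None and x < threshold)
            (left if go_left else right).append(y)
        score = sse(left) + sse(right)
        if best is None or score < best[0]:
            best = (score, default_left)
    return best  # (error, send-missing-left?)

# Toy data: targets for the missing-feature rows resemble the right branch.
x = [1.0, 2.0, None, 8.0, 9.0, None]
y = [0.0, 0.1, 1.0, 1.0, 1.1, 1.05]
err, missing_left = stump_with_default_direction(x, y, threshold=5.0)
```

Because the rows with missing features behave like the high-target group, the learned default direction routes them right, which is exactly how XGBoost extracts signal from missingness instead of discarding or imputing it.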

Keywords: eXtreme gradient boosting, missing data, Alzheimer's disease, early mild cognitive impairment, late mild cognitive impairment, multiclass classification, ADNI, support vector machine, random forest

Procedia PDF Downloads 189
3563 Series "H154M" as a Unit Area of the Region between the Lines and Curves

Authors: Hisyam Hidayatullah

Abstract:

Events in this world, whether we consciously realize it or not, follow patterns, up to the scale of the universe, which according to the Big Bang theory gave the solar system its regular rotation. The author derives the area of the region between the quadratic curve y=kx² and the line y=ka² using the GeoGebra application, version 4.2. This paper provides a series that is no less interesting than the Fourier series, adding new material on series that can be computed with sigma notation; moreover, the terms form a distinctive sequence over the natural numbers as the area changes. Finally, this paper provides analytical and geometric proofs for the area of the region bounded by the line y=ka², the curve y=kx², the x-axis, and the lines x=√a and x=-√a, which for k=1 and a ∈ ℕ yields the series ∑_{i=0}^{n} (4i√i)/3 = 0 + 4/3 + (8√2)/3 + 4√3 + ⋯ + (4n√n)/3. The author calls this series “H154M”.
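Under one consistent reading of the stated boundaries (k = 1, horizontal line at height a, vertical lines at x = ±√a), each term of the series is the area

```latex
A(a) \;=\; \int_{-\sqrt{a}}^{\sqrt{a}} \bigl(a - x^{2}\bigr)\,dx
     \;=\; \Bigl[\,a x - \tfrac{x^{3}}{3}\Bigr]_{-\sqrt{a}}^{\sqrt{a}}
     \;=\; \frac{4}{3}\,a\sqrt{a},
```

so a = 1, 2, 3, … gives 4/3, 8√2/3, 4√3, …, matching the terms of the partial sum ∑_{i=0}^{n} 4i√i/3 stated in the abstract.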

Keywords: sequence, series, sigma notation, application GeoGebra

Procedia PDF Downloads 377
3562 Hybrid Model: An Integration of Machine Learning with Traditional Scorecards

Authors: Golnush Masghati-Amoli, Paul Chin

Abstract:

In recent years, with rapid increases in data availability and computing power, Machine Learning (ML) techniques have been adopted across a range of industries for their strong predictive capability. However, the use of Machine Learning in commercial banking has been limited by a particular challenge: numerous regulations require lenders to be able to explain their analytic models, not only to regulators but often to consumers. In other words, although Machine Learning techniques enable more accurate prediction, they are adopted less frequently in commercial banking than in other industries, especially for scoring purposes. This is because Machine Learning techniques are often considered a black box and fail to explain why a certain risk score is given to a customer. To bridge this gap between the explainability and the performance of Machine Learning techniques, a Hybrid Model was developed at Dun & Bradstreet that blends Machine Learning algorithms with traditional approaches such as scorecards. The Hybrid Model maximizes the efficiency of traditional scorecards by merging their practical benefits, such as explainability and the ability to incorporate domain knowledge, with the deep insights of Machine Learning techniques, which can uncover patterns scorecard approaches cannot. First, through the development of Machine Learning models, engineered features, latent variables, and feature interactions with high information value for predicting customer risk are identified. These features are then used to introduce the observed non-linear relationships between the explanatory and dependent variables into traditional scorecards.
Moreover, instead of directly computing the Weight of Evidence (WoE) from good and bad data points, the Hybrid Model tries to match the score distribution generated by a Machine Learning algorithm, which yields an estimate of the WoE for each bin. This capability helps to build powerful scorecards with sparse cases that cannot be handled by traditional approaches. The proposed Hybrid Model is tested on different portfolios where a significant gap is observed between the performance of traditional scorecards and Machine Learning models. The analysis shows that the Hybrid Model can improve the performance of traditional scorecards by introducing non-linear relationships between explanatory and target variables from Machine Learning models into traditional scorecards. In some scenarios, the Hybrid Model is almost as predictive as the Machine Learning techniques while remaining as transparent as traditional scorecards. It is therefore concluded that, with the Hybrid Model, Machine Learning algorithms can be used in the commercial banking industry without concern about the difficulty of explaining the models for regulatory purposes.
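For context, the classical WoE computed directly from good/bad counts looks like this. The bin counts are toy values; the Hybrid Model described above instead estimates the per-bin WoE by matching an ML score distribution.

```python
import math

def weight_of_evidence(bins):
    """Classical WoE per bin: ln( (good_i / total_good) / (bad_i / total_bad) ).
    bins = list of (goods, bads) counts; positive WoE means a 'safer' bin."""
    total_good = sum(g for g, _ in bins)
    total_bad = sum(b for _, b in bins)
    return [math.log((g / total_good) / (b / total_bad)) for g, b in bins]

# Toy risk bins for one scorecard characteristic: (goods, bads).
woes = weight_of_evidence([(800, 20), (150, 30), (50, 50)])
```

The classical formula breaks down when a bin has zero (or very few) goods or bads, which is precisely the sparse-bin situation the score-distribution-matching trick is meant to address.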

Keywords: machine learning algorithms, scorecard, commercial banking, consumer risk, feature engineering

Procedia PDF Downloads 136
3561 Quantum Statistical Machine Learning and Quantum Time Series

Authors: Omar Alzeley, Sergey Utev

Abstract:

Minimizing a constrained multivariate function is fundamental to machine learning, and such algorithms are at the core of data mining and data visualization techniques. The decision function that maps input points to output points is based on the result of this optimization, which is central to learning theory. Time series analysis is one approach to complex systems in which the dynamics of the system are inferred from a statistical analysis of the fluctuations over time of some associated observable. The purpose of this paper is a mathematical transition from the autoregressive model of classical time series to the matrix formalization of quantum theory. First, we propose a quantum time series (QTS) model. Although the Hamiltonian technique has become an established tool for detecting deterministic chaos, other approaches are emerging; the quantum probabilistic technique is used to motivate the construction of our QTS model, which resembles the quantum dynamic model that has been applied to financial data. Second, various statistical methods, including machine learning algorithms such as the Kalman filter, are applied to estimate and analyse the unknown parameters of the model. Finally, simulation techniques such as Markov chain Monte Carlo are used to support our investigations. The proposed model has been examined using real and simulated data. We establish the relation between quantum statistical machines and quantum time series via random matrix theory. It is interesting to note that the primary application of QTS in the field of quantum chaos has been to find a model that explains chaotic behaviour; perhaps this model will reveal further insight into quantum chaos.
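The Kalman filtering step mentioned above can be sketched for a scalar latent AR(1) state. All numbers are illustrative, and the paper's QTS model uses the matrix formalism of quantum theory rather than this classical scalar form.

```python
def kalman_ar1(observations, phi, q, r):
    """Scalar Kalman filter for a latent AR(1) state x_t = phi*x_{t-1} + w_t,
    observed as y_t = x_t + v_t, with Var(w)=q and Var(v)=r."""
    x, p = 0.0, 1.0            # state estimate and its variance
    estimates = []
    for y in observations:
        # Predict step: propagate the AR(1) dynamics.
        x, p = phi * x, phi * phi * p + q
        # Update step: blend the prediction with the new observation.
        k = p / (p + r)        # Kalman gain
        x = x + k * (y - x)
        p = (1.0 - k) * p
        estimates.append(x)
    return estimates

# Toy observed series (e.g. a noisy decaying observable).
ys = [1.0, 0.9, 0.7, 0.8, 0.6]
est = kalman_ar1(ys, phi=0.8, q=0.1, r=0.2)
```

The filtered estimates track the observations while smoothing the measurement noise; the same predict/update recursion is what parameter-estimation routines for state-space time series models are built on.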

Keywords: machine learning, simulation techniques, quantum probability, tensor product, time series

Procedia PDF Downloads 469
3560 Design and Implementation of Low-code Model-building Methods

Authors: Zhilin Wang, Zhihao Zheng, Linxin Liu

Abstract:

This study proposes a low-code model-building approach that aims to simplify the development and deployment of artificial intelligence (AI) models. With an intuitive drag-and-drop interface for connecting components, users can easily build complex models and integrate multiple algorithms for training. After training is completed, the system automatically generates a callable model service API. This method not only lowers the technical threshold of AI development and improves development efficiency, but also enhances the flexibility of algorithm integration and simplifies model deployment. The core strengths of this method are its ease of use and efficiency: users do not need a deep programming background and can design and implement complex models with simple drag-and-drop operations. This greatly expands the reach of AI technology, allowing more non-technical people to participate in the development of AI models. At the same time, the method performs well in algorithm integration, supporting many different types of algorithms working together, which further improves the performance and applicability of the models. In the experimental part, we performed several performance tests on the method. The results show that, compared with traditional model construction methods, this method uses computing resources more efficiently and greatly shortens model training time. In addition, the system-generated model service interface has been optimized for high availability and scalability, and can adapt to the needs of different application scenarios.

Keywords: low-code, model building, artificial intelligence, algorithm integration, model deployment

Procedia PDF Downloads 31
3559 Electrical and Optical Properties of Polyaniline: Cadmium Sulphide Quantum Dots Nanocomposites

Authors: Akhtar Rasool, Tasneem Zahra Rizvi

Abstract:

In this study, a series of cadmium sulphide quantum dots/polyaniline nanocomposites with varying compositions was prepared by an in-situ polymerization technique and characterized using X-ray diffraction and Fourier-transform infrared spectroscopy. The surface morphology was studied by scanning electron microscopy. UV-Visible spectroscopy was used to determine the energy band gap of the nanoparticles and the nanocomposites. The temperature dependence of the DC electrical conductivity and the temperature and frequency dependence of the AC conductivity were investigated to study the charge transport mechanism in the nanocomposites. The DC conductivity was found to be typical of semiconducting behavior, following Mott's 1D variable-range hopping model. The frequency-dependent AC conductivity followed the universal power law.

Keywords: conducting polymers, nanocomposites, polyaniline composites, quantum dots

Procedia PDF Downloads 255
3558 Computer Aided Design Solution Based on Genetic Algorithms for FMEA and Control Plan in Automotive Industry

Authors: Nadia Belu, Laurenţiu Mihai Ionescu, Agnieszka Misztal

Abstract:

The automotive industry is one of the most important industries in the world, affecting not only the economy but also world culture. In the present financial and economic context, the field faces new challenges posed by the current crisis: companies must maintain product quality and deliver on time at a competitive price in order to achieve customer satisfaction. Two of the techniques most recommended by the automotive industry's quality management standards for product development are Failure Mode and Effects Analysis (FMEA) and the Control Plan. FMEA is a methodology for risk management and quality improvement aimed at identifying potential causes of failure in products and processes, quantifying them through risk assessment, ranking the identified problems according to their importance, and determining and implementing the related corrective actions. Companies use Control Plans, built from the FMEA results, to evaluate a process or product for strengths and weaknesses and to prevent problems before they occur. Control Plans are written descriptions of the systems used to control and minimize product and process variation; in addition, they specify the process monitoring and control methods (for example, Special Controls) used to control Special Characteristics. In this paper, we propose a computer-aided solution based on Genetic Algorithms that reduces the effort of drafting the FMEA analysis and Control Plan reports required for product launch and builds up the development teams' knowledge for future projects. The solution allows the design team to enter the data required for the FMEA. The actual analysis is performed using Genetic Algorithms to find an optimum between the RPN risk factor and the cost of production. A strength of Genetic Algorithms is that they can find solutions to multi-criteria optimization problems.
In our case, the three specific FMEA risk factors are considered together with the reduction of production cost. The analysis tool generates final reports for all FMEA processes, and the data obtained in the FMEA reports are automatically integrated with the other parameters entered in the Control Plan. The solution is implemented as an application running on an intranet across two servers: one containing the analysis and plan-generation engine, and the other containing the database where the initial parameters and results are stored. The results can then be used as starting solutions in the synthesis of other projects. The solution was applied to the welding, laser cutting, and bending processes used to manufacture bus chassis. Its advantages are the efficient elaboration of documents in the current project, by automatically generating the FMEA and Control Plan reports using multi-criteria optimization of production, and the building of a solid knowledge base for future projects. The proposed solution is a cheap alternative to other solutions on the market, as it is implemented using Open Source tools.
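The RPN-versus-cost search can be sketched with a small genetic algorithm over a binary genome, where each bit decides whether the corrective action for a failure mode is applied. The failure-mode numbers, costs, and GA settings below are invented for illustration and are not the paper's data.

```python
import random

random.seed(1)

# Hypothetical failure modes: (severity, occurrence, detection) before action,
# and for each a corrective action giving (occurrence_after, detection_after, cost).
modes = [
    ((8, 6, 5), (3, 3, 40.0)),
    ((7, 4, 6), (2, 3, 25.0)),
    ((9, 5, 4), (2, 2, 60.0)),
]

def fitness(genome):
    """Weighted sum of residual RPN (S*O*D) and action cost (lower is better)."""
    total = 0.0
    for gene, ((s, o, d), (o2, d2, cost)) in zip(genome, modes):
        total += (s * o2 * d2 + cost) if gene else (s * o * d)
    return total

def evolve(pop_size=20, generations=40, mutation=0.1):
    pop = [[random.randint(0, 1) for _ in modes] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        survivors = pop[: pop_size // 2]          # elitist selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, len(modes))  # one-point crossover
            child = a[:cut] + b[cut:]
            children.append([1 - g if random.random() < mutation else g
                             for g in child])
        pop = survivors + children
    return min(pop, key=fitness)

best = evolve()
```

With these toy numbers every corrective action pays for itself, so the GA converges to applying all three; with more expensive actions the same fitness function would trade some residual RPN against cost.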

Keywords: automotive industry, FMEA, control plan, automotive technology

Procedia PDF Downloads 406
3557 An Intelligent Controller Augmented with Variable Zero Lag Compensation for Antilock Braking System

Authors: Benjamin Chijioke Agwah, Paulinus Chinaenye Eze

Abstract:

The antilock braking system (ABS) is one of the automobile industry's important contributions to road safety, designed so that vehicles remain steerable and stable during emergency braking. This paper presents a wheel-slip-based intelligent controller with variable zero lag compensation for ABS. The goal is to achieve very fast, accurate wheel slip tracking during hard braking and to eliminate chattering, with improved transient and steady-state performance, while shortening the stopping distance using an effective braking torque below the maximum allowable torque. The dynamics of a vehicle braking from a velocity of 30 ms⁻¹ on a straight line were determined and modelled in the MATLAB/Simulink environment to represent a conventional ABS system without a controller. Simulation results indicated that the system without a controller was not able to track the desired wheel slip, and the stopping distance was 135.2 m. Hence, an intelligent controller based on fuzzy logic (FLC) was designed, with a variable zero lag compensator (VZLC) added to enhance the performance of the FLC control variable by eliminating steady-state error and providing improved bandwidth to suppress high-frequency noise such as chattering during braking. The simulation results showed that the FLC-VZLC provided fast tracking of the desired wheel slip, eliminated chattering, and reduced the stopping distance by 70.5% (39.92 m), 63.3% (49.59 m), 57.6% (57.35 m), and 50% (69.13 m) on dry, wet, cobblestone, and snowy road surfaces, respectively. Overall, the proposed system used an effective braking torque below the maximum allowable braking torque to achieve efficient wheel slip tracking and robust control performance on different road surfaces.
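The controlled variable itself is simple to state. A sketch of the longitudinal slip computation (the wheel radius and speeds are illustrative, not the paper's vehicle parameters):

```python
def wheel_slip(vehicle_speed, wheel_speed, wheel_radius):
    """Longitudinal slip ratio during braking:
    lambda = (v - omega * r) / v.
    lambda = 0: free rolling; lambda = 1: wheel fully locked."""
    if vehicle_speed <= 0.0:
        return 0.0
    return (vehicle_speed - wheel_speed * wheel_radius) / vehicle_speed

# At 30 m/s, a locked wheel (omega = 0) gives slip 1; ABS controllers
# typically target a slip near the tyre's friction peak, around 0.2.
slip_locked = wheel_slip(30.0, 0.0, 0.3)
slip_target = wheel_slip(30.0, 80.0, 0.3)   # omega * r = 24 m/s
```

The controller's job is to modulate braking torque so that this ratio stays near the friction peak instead of running to 1 (lock-up), which is what preserves steerability.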

Keywords: ABS, fuzzy logic controller, variable zero lag compensator, wheel slip tracking

Procedia PDF Downloads 147
3556 Introduction to Multi-Agent Deep Deterministic Policy Gradient

Authors: Xu Jie

Abstract:

As a key network security service, cryptographic services must cope with problems such as the wide variety of cryptographic algorithms, high concurrency requirements, random job crossovers, and instantaneous surges in workload. Their complexity and dynamics also make it difficult for traditional static security policies to cope with ever-changing cyber threats and environments, and traditional resource scheduling algorithms are inadequate for complex decision-making problems in dynamic environments. A network cryptographic resource allocation algorithm based on reinforcement learning is proposed, aiming to optimize task energy consumption, migration cost, and the fitness of differentiated services (including user, data, and task security). By modeling the multi-job collaborative cryptographic service scheduling problem as a multi-objective optimized job-flow scheduling problem and applying a multi-agent reinforcement learning method, efficient scheduling and optimal configuration of cryptographic service resources are achieved. By introducing reinforcement learning, resource allocation strategies can be adjusted in real time in a dynamic environment, improving resource utilization and achieving load balancing. Experimental results show that this algorithm has significant advantages in path planning length, system delay, and network load balancing, and effectively solves the complex resource scheduling problem in cryptographic services.
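A tabular Q-learning toy, far simpler than the multi-agent method of the title and with invented servers, rewards, and hyperparameters, conveys the scheduling-by-reinforcement idea: the agent learns to place jobs so that server load stays balanced.

```python
import random

random.seed(0)
N_SERVERS, N_JOBS = 3, 5
ALPHA, GAMMA, EPSILON = 0.5, 0.9, 0.2
Q = {}  # key: (state, action); state = tuple of current server loads

def greedy(state):
    return max(range(N_SERVERS), key=lambda a: Q.get((state, a), 0.0))

for episode in range(2000):
    loads = [0] * N_SERVERS
    for _ in range(N_JOBS):
        state = tuple(loads)
        # Epsilon-greedy action selection.
        action = (random.randrange(N_SERVERS)
                  if random.random() < EPSILON else greedy(state))
        loads[action] += 1
        reward = -(max(loads) - min(loads))   # penalize load imbalance
        nxt = tuple(loads)
        best_next = max(Q.get((nxt, a), 0.0) for a in range(N_SERVERS))
        old = Q.get((state, action), 0.0)
        Q[(state, action)] = old + ALPHA * (reward + GAMMA * best_next - old)

# Greedy rollout with the learned table: loads should end near-balanced.
loads = [0] * N_SERVERS
for _ in range(N_JOBS):
    loads[greedy(tuple(loads))] += 1
```

The real problem replaces this toy reward with the energy, migration-cost, and security-fitness objectives described above, and uses multiple cooperating agents rather than one table.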

Keywords: multi-agent reinforcement learning, non-stationary dynamics, multi-agent systems, cooperative and competitive agents

Procedia PDF Downloads 26
3555 Probabilistic Simulation of Triaxial Undrained Cyclic Behavior of Soils

Authors: Arezoo Sadrinezhad, Kallol Sett, S. I. Hariharan

Abstract:

In this paper, a probabilistic framework based on Fokker-Planck-Kolmogorov (FPK) approach has been applied to simulate triaxial cyclic constitutive behavior of uncertain soils. The framework builds upon previous work of the writers, and it has been extended for cyclic probabilistic simulation of triaxial undrained behavior of soils. von Mises elastic-perfectly plastic material model is considered. It is shown that by using probabilistic framework, some of the most important aspects of soil behavior under cyclic loading can be captured even with a simple elastic-perfectly plastic constitutive model.

Keywords: elasto-plasticity, uncertainty, soils, Fokker-Planck equation, Fourier spectral method, finite difference method

Procedia PDF Downloads 379
3554 Quantification of Site Nonlinearity Based on HHT Analysis of Seismic Recordings

Authors: Ruichong Zhang

Abstract:

This study proposes a recording-based approach to characterize and quantify earthquake-induced site nonlinearity, exemplified by soil nonlinearity and/or liquefaction. As an alternative to Fourier spectral analysis (FSA), the paper introduces time-frequency analysis of earthquake ground motion recordings with the aid of the so-called Hilbert-Huang transform (HHT) and offers justification for using the HHT to address the nonlinear features shown in the recordings. Using the 2001 Nisqually earthquake recordings, this study shows that the proposed approach is effective in characterizing site nonlinearity and quantifying its influence on seismic ground responses.
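The Hilbert step of the HHT (the amplitude/phase extraction that follows empirical mode decomposition, which is omitted here) can be sketched with a direct-DFT analytic signal. The O(N²) DFT is for clarity only; in practice an FFT would be used, and the test signal is a synthetic cosine, not an earthquake record.

```python
import cmath
import math

def analytic_signal(x):
    """Analytic signal via the frequency-domain Hilbert construction:
    DFT, zero the negative frequencies, double the positive ones, inverse DFT."""
    n = len(x)
    X = [sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
         for k in range(n)]
    H = [0.0] * n
    H[0] = 1.0
    if n % 2 == 0:
        H[n // 2] = 1.0
        for k in range(1, n // 2):
            H[k] = 2.0
    else:
        for k in range(1, (n + 1) // 2):
            H[k] = 2.0
    Xa = [Xk * Hk for Xk, Hk in zip(X, H)]
    return [sum(Xa[k] * cmath.exp(2j * math.pi * k * t / n)
                for k in range(n)) / n
            for t in range(n)]

# Instantaneous amplitude (envelope) of a pure cosine should be ~constant 1.
n = 64
signal = [math.cos(2 * math.pi * 4 * t / n) for t in range(n)]
envelope = [abs(z) for z in analytic_signal(signal)]
```

The magnitude of the analytic signal gives instantaneous amplitude and its phase derivative gives instantaneous frequency, which is what lets the HHT track the time-varying, nonlinear content that a single Fourier spectrum averages away.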

Keywords: site nonlinearity, site amplification, site damping, Hilbert-Huang Transform (HHT), liquefaction, 2001 Nisqually Earthquake

Procedia PDF Downloads 487
3553 Pretreatment of Cattail (Typha domingensis) Fibers to Obtain Cellulose Nanocrystals

Authors: Marivane Turim Koschevic, Maycon dos Santos, Marcello Lima Bertuci, Farayde Matta Fakhouri, Silvia Maria Martelli

Abstract:

Natural fibers are raw materials rich in cellulose and abundant worldwide, and their use for the extraction of cellulose nanocrystals is promising; one example is the cattail, a macrophyte weed native to South America. This study deals with the pre-treatment of crushed cattail fibers by six different mercerization methods, followed by bleaching. The positive effects of the treatments on the fibers were verified by optical microscopy and Fourier-transform infrared (FTIR) spectroscopy. The sample selected for future cellulose nanocrystal extraction tests was treated with 2.5% NaOH for 2 h at 60 °C in the first stage, then with 30 vol H2O2 and 5% NaOH in a 30/70% (v/v) ratio for 1 hour at 60 °C, followed by treatment at 50/50% (v/v) for 15 minutes at 50 °C with the same solution constituents.

Keywords: cellulose nanocrystal, chemical treatment, mercerization, natural fibers

Procedia PDF Downloads 293
3552 Dynamics of Light Induced Current in 1D Coupled Quantum Dots

Authors: Tokuei Sako

Abstract:

Laser-induced current in a quasi-one-dimensional nanostructure has been studied using a model of a few electrons confined in a 1D electrostatic potential, coupled to electrodes at both ends and subjected to a pulsed laser field. The time propagation of the one- and two-electron wave packets has been calculated by integrating the time-dependent Schrödinger equation directly with the symplectic integrator method on a uniform Fourier grid. The temporal behavior of the resultant light-induced current in the studied systems is discussed with respect to the lifetime of the quasi-bound states formed when the static bias voltage is applied.
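
Grid-based integration of the time-dependent Schrödinger equation can be illustrated with a split-operator step on a uniform Fourier grid, a close relative of the symplectic scheme named above. Everything below (potential, wave packet, units with ħ = m = 1) is an assumed toy setup, not the authors' model:

```python
import numpy as np

# Uniform spatial grid and its conjugate momentum grid.
n, L = 512, 40.0
x = np.linspace(-L / 2, L / 2, n, endpoint=False)
k = 2 * np.pi * np.fft.fftfreq(n, d=L / n)

V = 0.5 * 0.1 * x**2                          # illustrative confining potential
psi = np.exp(-(x + 5) ** 2) * np.exp(0.5j * x)  # Gaussian packet with momentum
psi /= np.sqrt(np.sum(np.abs(psi) ** 2) * (L / n))

# Split-operator propagation: half potential step, full kinetic step (applied
# in momentum space via FFT), half potential step -- each factor is unitary.
dt, steps = 0.01, 500
expV = np.exp(-0.5j * dt * V)
expT = np.exp(-0.5j * dt * k**2)
for _ in range(steps):
    psi = expV * np.fft.ifft(expT * np.fft.fft(expV * psi))

norm = np.sum(np.abs(psi) ** 2) * (L / n)
print(f"norm after propagation: {norm:.6f}")  # unitary: stays 1 to machine precision
```

Norm conservation is the practical check that the discretized propagation remains unitary, which is the property symplectic-type integrators are chosen for.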

Keywords: pulsed laser field, nanowire, electron wave packet, quantum dots, time-dependent Schrödinger equation

Procedia PDF Downloads 357
3551 The Fast Diagnosis of Acanthamoeba Keratitis Using Real-Time PCR Assay

Authors: Fadime Eroglu

Abstract:

The genus Acanthamoeba belongs to the kingdom Protozoa, and its members are known as free-living amoebae. Acanthamoeba species have been isolated from human bodies, swimming pools, bottled mineral water, contact lens solutions, dust, and soil. Members of the genus Acanthamoeba cause Acanthamoeba keratitis, a painful sight-threatening disease of the eyes. In recent years, the prevalence of Acanthamoeba keratitis has been reported at a high rate. Eight different Acanthamoeba species are known to be implicated in Acanthamoeba keratitis; these species include Acanthamoeba castellanii, Acanthamoeba polyphaga, Acanthamoeba griffini, Acanthamoeba hatchetti, Acanthamoeba culbertsoni, and Acanthamoeba rhysodes. The conventional diagnosis of Acanthamoeba keratitis has relied on cytological preparations and growth of Acanthamoeba in culture. However, molecular methods such as real-time PCR have been found to be more sensitive. Real-time PCR has emerged in the last decade as an effective method for more rapid testing in the diagnosis of infectious disease. Therefore, a real-time PCR assay for the detection of Acanthamoeba keratitis and Acanthamoeba species was developed in this study. The 18S rRNA sequences of Acanthamoeba species were obtained from the National Center for Biotechnology Information, and the sequences were aligned with the MEGA 6 program. Primers and a probe were designed using the Custom Primers-OligoPerfect™ Designer (Thermo Fisher Scientific, Waltham, MA, USA). They were also assayed for hairpin formation and degree of primer-dimer formation with Multiple Primer Analyzer (Thermo Fisher Scientific, Waltham, MA, USA). Eight different ATCC Acanthamoeba species were obtained, and DNA was extracted using the Qiagen Mini DNA extraction kit (Qiagen, Hilden, Germany). The DNA of the Acanthamoeba species was analyzed using the newly designed primer and probe set in a real-time PCR assay.
The early definitive laboratory diagnosis of Acanthamoeba keratitis and the rapid initiation of suitable therapy are necessary for clinical prognosis. The results of the study showed that the new primer and probe set could be used to detect and distinguish Acanthamoeba species. This newly developed method is helpful for the diagnosis of Acanthamoeba keratitis.

Keywords: Acanthamoeba keratitis, Acanthamoeba species, fast diagnosis, real-time PCR

Procedia PDF Downloads 121
3550 Magnetic Nanoparticles for Protein C Purification

Authors: Duygu Çimen, Nilay Bereli, Adil Denizli

Abstract:

The aim of this study is to synthesize magnetic nanoparticles to purify protein C. For this purpose, N-methacryloyl-(L)-histidine methyl ester (MAH)-containing, 2-hydroxyethyl methacrylate (HEMA)-based magnetic nanoparticles were synthesized using the micro-emulsion polymerization technique for templating protein C via metal chelation. The obtained nanoparticles were characterized with Fourier transform infrared spectroscopy (FTIR), transmission electron microscopy (TEM), zeta-size analysis, and electron spin resonance (ESR) spectroscopy. After that, they were used for protein C purification from aqueous solution to evaluate and optimize the adsorption conditions. The affecting factors, such as concentration, pH, ionic strength, temperature, and reusability, were evaluated. As the last step, protein C was determined with sodium dodecyl sulfate-polyacrylamide gel electrophoresis.

Keywords: immobilized metal affinity chromatography (IMAC), magnetic nanoparticle, protein C, hydroxyethyl methacrylate (HEMA)

Procedia PDF Downloads 426
3549 Evaluation of Natural Frequency of Single and Grouped Helical Piles

Authors: Maryam Shahbazi, Amy B. Cerato

Abstract:

The importance of a system's natural frequency (fn) emerges when the vibration force frequency is equivalent to the foundation's fn, which causes resonance of the response amplitude that may cause irreversible damage to the structure. Several factors such as pile geometry (e.g., length and diameter), soil density, load magnitude, pile condition, and physical structure affect the fn of a soil-pile system; some of these parameters are evaluated in this study. Although experimental and analytical studies have assessed the fn of a soil-pile system, few have included individual and grouped helical piles. Thus, the current study aims to provide quantitative data on the dynamic characteristics of helical pile-soil systems from full-scale shake table tests that will allow engineers to predict a more realistic dynamic response under motions with variable frequency ranges. To evaluate the fn of single and grouped helical piles in dry dense sand, full-scale shake table tests were conducted in a laminar box (6.7 m × 3.0 m, 4.6 m high). Helical piles of two different diameters (8.8 cm and 14 cm) were embedded in the soil box, with corresponding lengths of 3.66 m (except one pile with a length of 3.96 m) and 4.27 m. Different configurations were implemented to evaluate conditions such as fixed and pinned connections. In the group configuration, all four piles with similar geometry were tied together. Simulated real earthquake motions, in addition to white noise, were applied to evaluate a wide range of soil-pile system behavior. The Fast Fourier Transform (FFT) of the time history responses measured with installed strain gages and accelerometers was used to evaluate fn. Time-history records from either accelerometers or strain gages were found acceptable for calculating fn. In this study, the existence of a pile reduced the fn of the soil slightly. Greater fn occurred for single piles with larger l/d ratios (higher slenderness ratio).
Also, regardless of the connection type, the more slender pile group, which is surrounded by more soil, yielded higher natural frequencies under white noise, possibly because it exhibits more passive soil resistance. Relatively speaking, within both pile groups, a pinned connection led to a lower fn than a fixed connection (e.g., for the same pile group, the fn values are 5.23 Hz and 4.65 Hz for fixed and pinned connections, respectively). Generally speaking, a stronger motion causes nonlinear behavior and degrades stiffness, which reduces a pile's fn; even more reduction occurs in soil with a lower density. Moreover, the fn of dense sand under a white-noise signal was 5.03 Hz, which was reduced by 44% when an earthquake with an acceleration of 0.5 g was applied. By knowing the factors affecting fn, the designer can effectively match the properties of the soil to a type of pile and structure to attempt to avoid resonance. The quantitative results in this study assist engineers in predicting a probable range of fn for helical pile foundations under potential future earthquake and machine loading forces.
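
The FFT-based identification of fn from a measured time history, as described above, can be sketched roughly as follows; the synthetic accelerometer signal and its parameters are assumptions for illustration only:

```python
import numpy as np

# Hypothetical accelerometer record from a white-noise run: the pile-soil
# system responds most strongly at its natural frequency. Here a 5.03 Hz
# resonant component (the dense-sand value reported above) rides on broadband
# noise; amplitude, noise level, and duration are all assumed.
fs = 100.0                                  # sampling rate, Hz
t = np.arange(0.0, 40.0, 1.0 / fs)
rng = np.random.default_rng(42)
accel = np.sin(2 * np.pi * 5.03 * t) + 0.3 * rng.standard_normal(t.size)

# FFT of the measured time history; the location of the spectral peak
# estimates fn. A Hann window reduces leakage from the finite record.
spec = np.abs(np.fft.rfft(accel * np.hanning(accel.size)))
freqs = np.fft.rfftfreq(accel.size, d=1.0 / fs)
fn_est = freqs[np.argmax(spec)]
print(f"estimated natural frequency: {fn_est:.2f} Hz")
```

The same peak-picking applies whether the time history comes from an accelerometer or a strain gage, which is why both were found acceptable for calculating fn.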

Keywords: helical pile, natural frequency, pile group, shake table, stiffness

Procedia PDF Downloads 133
3548 Regenerated Cellulose Prepared by Using NaOH/Urea

Authors: Lee Chiau Yeng, Norhayani Othman

Abstract:

Regenerated cellulose fiber is fabricated in the NaOH/urea aqueous solution. In this work, cellulose is dissolved in 7 wt% NaOH/12 wt% urea at a temperature of −12 °C to prepare regenerated cellulose. The thermal and structural properties of cellulose and regenerated cellulose were compared and investigated by Field Emission Scanning Electron Microscopy (FESEM), Fourier Transform Infrared Spectroscopy (FTIR), X-Ray Diffraction (XRD), Thermogravimetric Analysis (TGA), and Differential Scanning Calorimetry. The FESEM results revealed that the regenerated cellulose fibers showed a more circular shape with irregular size due to fiber agglomeration. FTIR showed the differences between the structures of cellulose and the regenerated cellulose fibers. In this case, the regenerated cellulose fibers have a cellulose II crystalline structure with a lower degree of crystallinity. Regenerated cellulose exhibited better thermal stability than cellulose.

Keywords: regenerated cellulose, cellulose, NaOH, urea

Procedia PDF Downloads 431
3547 Financial Administration of Urban Local Governance: A Comparative Study of Ahmedabad Municipal Corporation (AMC) and Bhavnagar Municipal Corporation(BMC)

Authors: Aneri Mehta, Krunal Mehta

Abstract:

Financial administration is the part of government which deals with the collection, preservation, and distribution of public funds; with the coordination of public revenue and expenditure; with the management of credit operations on behalf of the state; and with the general control of the financial affairs of public households. The researcher has taken the prime body of local self-government, viz. the Municipal Corporation. The number of municipal corporations in India has rapidly increased in recent years. About 27% of the country's total population lives in urban areas, and this share is increasing very fast. People are moving rapidly from rural to urban areas, and their demands and awareness are increasing day by day. Municipal corporations render many services for the development of urban areas. Thus, the researcher has taken a step to study the accounting practices of the municipal corporations of Gujarat state (AMC and BMC). The research will show the status of the finances of these municipal corporations. Article 243(W) of the Constitution of India envisages that the state government may, by law, endow the municipalities with such powers and authority as may be necessary to enable them to function as institutions of self-government, and such law may contain provisions for the devolution of powers and responsibilities upon municipalities, subject to such conditions as may be specified therein, with respect to (i) the preparation of plans for economic development and social justice and (ii) the performance of functions and the implementation of schemes as may be entrusted to them, including those in relation to the matters listed in the Twelfth Schedule. The three-tier structure of the Indian government, i.e., Union, State, and Local Self-Government, is the scenario of the Indian Constitution. Local self-government performs or renders many services under the direct control of the state government.
They (local bodies) possess autonomy within their limited sphere, raise revenue through local taxation, and spend their income on local services.

Keywords: financial administration, urban local bodies, local self government, constitution

Procedia PDF Downloads 467
3546 Investigating Selected Traditional African Medicinal Plants for Anti-fibrotic Potential: Identification and Characterization of Bioactive Compounds Through Fourier-Transform Infrared Spectroscopy and Gas Chromatography-Mass Spectrometry Analysis

Authors: G. V. Manzane, S. J. Modise

Abstract:

Uterine fibroids, also known as leiomyomas or myomas, are non-cancerous growths that develop in the muscular wall of the uterus during the reproductive years. The causes of uterine fibroids include hormonal factors, genetic factors, growth factors, and extracellular matrix factors. Common symptoms of uterine fibroids include heavy and prolonged menstrual bleeding, which can lead to a high risk of anemia, lower abdominal pain, pelvic pressure, infertility, and pregnancy loss. The growth of this tumor is a concern because of its negative impact on women's health and the increase in their economic burden. Traditional medicinal plants have long been used in Africa for their potential therapeutic effects against various ailments. In this study, we aimed to identify and characterize bioactive compounds from selected African medicinal plants with potential anti-fibrotic properties using Fourier-transform infrared spectroscopy (FTIR) and gas chromatography-mass spectrometry (GC-MS) analysis. Two medicinal plant species known for their traditional use in fibrosis-related conditions were selected for investigation. Aqueous extracts were prepared from the plant materials, and FTIR analysis was conducted to determine the functional groups present in the extracts. GC-MS analysis was performed to identify the chemical constituents of the extracts. The FTIR analysis revealed the presence of various functional groups, such as phenols, flavonoids, terpenoids, and alkaloids, known for their potential therapeutic activities. These functional groups are associated with antioxidant, anti-inflammatory, and anti-fibrotic properties. The GC-MS analysis identified several bioactive compounds, including flavonoids, alkaloids, terpenoids, and phenolic compounds, which are known for their pharmacological activities.
The discovery of bioactive compounds in African medicinal plants that exhibit anti-fibrotic effects opens up promising avenues for further research and development of potential treatments for fibrosis. This suggests the potential of these plants as a valuable source of novel therapeutic agents for treating fibrosis-related conditions. In conclusion, our study identified and characterized bioactive compounds from selected African medicinal plants using FTIR and GC-MS analysis. The presence of compounds with known anti-fibrotic properties suggests that these plants hold promise as a potential source of natural products for the development of novel anti-fibrotic therapies.

Keywords: uterine fibroids, African medicinal plants, bioactive compounds, identification and characterization

Procedia PDF Downloads 103
3545 The Outcome of Using Machine Learning in Medical Imaging

Authors: Adel Edwar Waheeb Louka

Abstract:

Purpose: AI-driven solutions are at the forefront of many pathology and medical imaging methods. Using algorithms designed to improve the experience of medical professionals within their respective fields, the efficiency and accuracy of diagnosis can improve. In particular, X-rays are a fast and relatively inexpensive test that can diagnose diseases. In recent years, X-rays have not been widely used to detect and diagnose COVID-19; this underuse is mainly due to low diagnostic accuracy and confounding with pneumonia, another respiratory disease. However, research in this field suggests that artificial neural networks can successfully diagnose COVID-19 with high accuracy. Models and Data: The dataset used is the COVID-19 Radiography Database, which includes images and masks of chest X-rays under the labels COVID-19, normal, and pneumonia. The classification model developed uses an autoencoder and a pre-trained convolutional neural network (DenseNet201) to provide transfer learning to the model. The model then uses a deep neural network to finalize the feature extraction and predict the diagnosis for the input image. This model was trained on 4035 images and validated on 807 images separate from those used for training. The training images are cropped beforehand to eliminate distractions when training the model. The image segmentation model uses an improved U-Net architecture to extract the lung mask from the chest X-ray image; it is trained on 8577 images and validated on a 20% validation split. The models are evaluated on an external dataset, and their accuracy, precision, recall, F1-score, IoU, and loss are calculated. Results: The classification model achieved an accuracy of 97.65% and a loss of 0.1234 when differentiating COVID-19-infected, pneumonia-infected, and normal lung X-rays.
The segmentation model achieved an accuracy of 97.31% and an IoU of 0.928. Conclusion: The proposed models can detect COVID-19, pneumonia, and normal lungs with high accuracy and derive the lung mask from a chest X-ray with similarly high accuracy. The hope is for these models to elevate the experience of medical professionals and provide insight into the future of the methods used.
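
The reported metrics can be computed from first principles as sketched below; the toy labels and masks are invented, standing in for predictions on the external validation set:

```python
import numpy as np

# Toy classification outputs: 0 = COVID-19, 1 = pneumonia, 2 = normal.
y_true = np.array([0, 0, 1, 1, 2, 2, 2, 0, 1, 2])
y_pred = np.array([0, 0, 1, 2, 2, 2, 2, 0, 1, 1])

accuracy = np.mean(y_true == y_pred)

# Per-class precision/recall, macro-averaged over the three labels, then F1.
precisions, recalls = [], []
for c in range(3):
    tp = np.sum((y_pred == c) & (y_true == c))
    precisions.append(tp / max(np.sum(y_pred == c), 1))
    recalls.append(tp / max(np.sum(y_true == c), 1))
precision, recall = np.mean(precisions), np.mean(recalls)
f1 = 2 * precision * recall / (precision + recall)

# IoU for a segmentation mask: |A ∩ B| / |A ∪ B| of the binary lung masks.
mask_true = np.zeros((8, 8), dtype=bool); mask_true[2:6, 2:6] = True
mask_pred = np.zeros((8, 8), dtype=bool); mask_pred[3:7, 2:6] = True
iou = (np.logical_and(mask_true, mask_pred).sum()
       / np.logical_or(mask_true, mask_pred).sum())
print(accuracy, round(f1, 3), round(iou, 3))
```

IoU is the stricter of the two segmentation numbers quoted above: pixel accuracy rewards the large true-negative background, whereas IoU only credits overlap of the predicted and true lung regions.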

Keywords: artificial intelligence, convolutional neural networks, deep learning, image processing, machine learning

Procedia PDF Downloads 74
3544 Image Segmentation Using 2-D Histogram in RGB Color Space in Digital Libraries

Authors: El Asnaoui Khalid, Aksasse Brahim, Ouanan Mohammed

Abstract:

This paper presents an unsupervised color image segmentation method. It is based on a hierarchical analysis of 2-D histogram in RGB color space. This histogram minimizes storage space of images and thus facilitates the operations between them. The improved segmentation approach shows a better identification of objects in a color image and, at the same time, the system is fast.
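
The storage argument can be illustrated with a minimal sketch: a 2-D histogram over two of the color channels compresses an image into a small fixed-size array in which dominant colors appear as peaks. The synthetic image and bin count below are assumptions for illustration:

```python
import numpy as np

# Synthetic two-region RGB image: a reddish half and a bluish half plus noise.
rng = np.random.default_rng(1)
img = np.zeros((64, 64, 3), dtype=np.uint8)
img[:, :32] = (200, 30, 30)      # reddish region
img[:, 32:] = (30, 30, 200)      # bluish region
img = np.clip(img.astype(int) + rng.integers(-10, 10, img.shape),
              0, 255).astype(np.uint8)

# 2-D histogram over the (R, G) plane with 32 bins per axis: a 32x32 array,
# far smaller than the image itself, which is the storage saving the
# abstract refers to. Hierarchical analysis would then split its peaks
# into segments.
hist, _, _ = np.histogram2d(img[..., 0].ravel(), img[..., 1].ravel(),
                            bins=32, range=[[0, 256], [0, 256]])
dominant = np.sum(hist > 0.1 * hist.max())   # crude count of dominant color cells
print("histogram shape:", hist.shape, "dominant cells:", dominant)
```

Operations between images (comparison, merging of color classes) then act on these small arrays rather than on full-resolution pixel data.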

Keywords: image segmentation, hierarchical analysis, 2-D histogram, classification

Procedia PDF Downloads 380
3543 Clubhouse: A Minor Rebellion against the Algorithmic Tyranny of the Majority

Authors: Vahid Asadzadeh, Amin Ataee

Abstract:

Since the advent of social media, there has been a wave of optimism among researchers and civic activists about the influence of virtual networks on the democratization process, a wave that has gradually waned. One of the lesser-known concerns is how to increase the possibility of hearing the voices of different minorities. According to the theory of media logic, the media, using their technological capabilities, act as a structure through which events and ideas are interpreted. Social media, through the use of machine learning and algorithms, has formed a kind of structure in which the voices of minorities and less popular topics are lost amid the commotion of the trends. In fact, the recommender systems and algorithms used in social media are designed to help promote trends and make popular content more popular, while content that belongs to minorities is constantly marginalized. As social networks gradually play a more active role in politics, the possibility of freely participating in the reproduction and reinterpretation of structures in general, and political structures in particular (as Laclau and Mouffe had in mind), can be considered a criterion of democracy in action. The point is that the media logic of virtual networks is shaped by the rule, and even the tyranny, of the majority, and this logic does not make it possible to design a self-founding and self-revolutionary model of democracy. In other words, today's social networks, though seemingly full of variety, are governed by a logic of homogeneity, and they do not allow for multiplicity as is the case in immanent radical democracies (influenced by Gilles Deleuze). However, with the emergence and increasing popularity of Clubhouse as a new social medium, there seems to be a shift in the social media space: the diminishing role of algorithms and recommender systems as content-delivery interfaces.
This has led to the voices of minorities being better heard in the Clubhouse and the diversity of political tendencies manifesting itself better. The purpose of this article is to show, first, how social networks serve the elimination of minorities in general, and second, to argue that the media logic of social networks must adapt to new interpretations of democracy that give more space to minorities and human rights. Finally, this article will show how the Clubhouse serves these new interpretations of democracy, at least in a minimal way. To achieve the mentioned goals, this article uses a descriptive-analytical method: first, the relation between media logic and postmodern democracy will be examined; then, the political economy of popularity in social media and its conflict with democracy will be discussed; finally, it will be explored how the Clubhouse provides a new horizon for the concepts embodied in radical democracy, a horizon that more effectively serves the rights of minorities and human rights in general.

Keywords: algorithmic tyranny, Clubhouse, minority rights, radical democracy, social media

Procedia PDF Downloads 147
3542 Random Walks and Option Pricing for European and American Options

Authors: Guillaume Leduc

Abstract:

In this paper, we describe a broad setting under which the error of random walk approximations to option prices can be quantified and controlled, and for which convergence occurs at a speed of n⁻¹ for both European and American options. We describe how knowledge of the error allows for arbitrarily fast acceleration of the convergence.
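
A standard instance of such a random-walk approximation is the Cox-Ross-Rubinstein binomial tree, whose error against the Black-Scholes limit for a European call shrinks roughly like n⁻¹. The sketch below uses illustrative parameters, not values from the paper:

```python
import math

def crr_call(S, K, r, sigma, T, n):
    """European call priced on an n-step Cox-Ross-Rubinstein binomial tree."""
    dt = T / n
    u = math.exp(sigma * math.sqrt(dt))
    d = 1 / u
    p = (math.exp(r * dt) - d) / (u - d)       # risk-neutral up-probability
    disc = math.exp(-r * dt)
    # Terminal payoffs, then backward induction through the tree.
    values = [max(S * u**j * d**(n - j) - K, 0.0) for j in range(n + 1)]
    for _ in range(n):
        values = [disc * (p * values[j + 1] + (1 - p) * values[j])
                  for j in range(len(values) - 1)]
    return values[0]

def bs_call(S, K, r, sigma, T):
    """Black-Scholes call price (the n -> infinity limit of the tree)."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    N = lambda z: 0.5 * (1 + math.erf(z / math.sqrt(2)))
    return S * N(d1) - K * math.exp(-r * T) * N(d2)

exact = bs_call(100, 100, 0.05, 0.2, 1.0)
for n in (50, 200, 800):
    err = abs(crr_call(100, 100, 0.05, 0.2, 1.0, n) - exact)
    print(n, round(err, 5))   # error shrinks roughly like 1/n
```

Knowing the error behaves like c/n is what makes acceleration possible: an extrapolation across two tree sizes can cancel the leading error term.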

Keywords: random walk approximation, European and American options, rate of convergence, option pricing

Procedia PDF Downloads 464
3541 Detection and Identification of Antibiotic Resistant UPEC Using FTIR-Microscopy and Advanced Multivariate Analysis

Authors: Uraib Sharaha, Ahmad Salman, Eladio Rodriguez-Diaz, Elad Shufan, Klaris Riesenberg, Irving J. Bigio, Mahmoud Huleihel

Abstract:

Antimicrobial drugs have played an indispensable role in controlling the illness and death associated with infectious diseases in animals and humans. However, the increasing resistance of bacteria to a broad spectrum of commonly used antibiotics has become a global healthcare problem. Many antibiotics have lost their effectiveness since the beginning of the antibiotic era because many bacteria have adapted defenses against them. Rapid determination of the antimicrobial susceptibility of a clinical isolate is often crucial for the optimal antimicrobial therapy of infected patients and in many cases can save lives. The conventional methods for susceptibility testing require the isolation of the pathogen from a clinical specimen by culturing on the appropriate media (this first culturing stage lasts 24 h). Then, chosen colonies are grown on media containing antibiotic(s), using micro-diffusion discs (the second culturing also takes 24 h), in order to determine bacterial susceptibility. Other methods, such as genotyping, the E-test, and automated methods, have also been developed for testing antimicrobial susceptibility. Most of these methods are expensive and time-consuming. Fourier transform infrared (FTIR) microscopy is a rapid, safe, effective, and low-cost method that has been widely and successfully used in different studies for the identification of various biological samples, including bacteria; nonetheless, its true potential in routine clinical diagnosis has not yet been established. The new modern infrared (IR) spectrometers with high spectral resolution enable measuring unprecedented biochemical information from cells at the molecular level. Moreover, the development of new bioinformatics analyses combined with IR spectroscopy yields a powerful technique, which enables the detection of structural changes associated with resistivity.
The main goal of this study is to evaluate the potential of FTIR microscopy in tandem with machine learning algorithms for rapid and reliable identification of bacterial susceptibility to antibiotics in a time span of a few minutes. The UTI E. coli bacterial samples, which were identified at the species level by MALDI-TOF and examined for their susceptibility by the routine assay (micro-diffusion discs), were obtained from the bacteriology laboratories at Soroka University Medical Center (SUMC). These samples were examined by FTIR microscopy and analyzed by advanced statistical methods. Our results, based on 700 E. coli samples, were promising and showed that by using the infrared spectroscopic technique together with multivariate analysis, it is possible to classify the tested bacteria into sensitive and resistant with a success rate higher than 90% for eight different antibiotics. Based on these preliminary results, it is worthwhile to continue developing the FTIR microscopy technique as a rapid and reliable method for identifying antibiotic susceptibility.
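
One simple form such multivariate analysis can take, offered here only as a hedged sketch with synthetic spectra (the study's actual classifier is not specified above), is PCA followed by a nearest-centroid rule:

```python
import numpy as np

# Synthetic "FTIR spectra": both classes share a baseline spectral shape;
# resistance is simulated as an altered absorption band. All amplitudes and
# noise levels are invented for illustration.
rng = np.random.default_rng(7)
n_per_class, n_wavenumbers = 40, 300
base = np.sin(np.linspace(0, 12, n_wavenumbers))            # shared shape
band = np.exp(-np.linspace(-3, 3, n_wavenumbers) ** 2)      # band altered by resistance
X = np.vstack([base + 0.1 * rng.standard_normal((n_per_class, n_wavenumbers)),
               base + 0.4 * band + 0.1 * rng.standard_normal((n_per_class, n_wavenumbers))])
y = np.repeat([0, 1], n_per_class)                          # 0=sensitive, 1=resistant

# PCA via SVD on mean-centered spectra; keep 5 components.
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:5].T

# Split: train on even rows, test on odd rows; nearest-centroid classification.
train, test = np.arange(0, len(y), 2), np.arange(1, len(y), 2)
centroids = np.stack([scores[train][y[train] == c].mean(axis=0) for c in (0, 1)])
pred = np.argmin(((scores[test][:, None, :] - centroids) ** 2).sum(-1), axis=1)
acc = np.mean(pred == y[test])
print(f"classification accuracy: {acc:.2f}")
```

The dimensionality reduction mirrors the role of multivariate analysis in the study: the discriminating biochemical changes occupy a low-dimensional subspace of the full spectrum.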

Keywords: antibiotics, E.coli, FTIR, multivariate analysis, susceptibility, UTI

Procedia PDF Downloads 174
3540 Preparation and Characterization of Copper-Nanoparticle on Extracted Carrageenan and Its Catalytic Activity for Reducing Aromatic Nitro Group

Authors: Vida Jodaeian, Behzad Sani

Abstract:

Copper nanoparticles were successfully synthesized on carrageenan green-extracted from seaweed by the precipitation method, without using any support or template. The crystallinity, optical properties, morphology, and composition of the products were characterized by X-ray diffraction (XRD), transmission electron microscopy (TEM), and Fourier transform infrared (FT-IR) spectroscopy. The effects of processing parameters, such as pH, on the size and shape of the Cu nanostructures were investigated. It was found that the reaction at lower (acidic) pH values could not be completed, and pH = 8.00 was the best value to prepare very fine nanoparticles. The as-synthesized Cu nanoparticles were used as catalysts for the reduction of aromatic nitro compounds in the presence of NaBH4. The results showed that the Cu nanoparticles are very active for the reduction of these nitro aromatic compounds.

Keywords: nanoparticles, carrageenan, seaweed, nitro aromatic compound

Procedia PDF Downloads 399