Search results for: fast Fourier algorithms
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4622

2582 Li-Fi Technology: Data Transmission through Visible Light

Authors: Shahzad Hassan, Kamran Saeed

Abstract:

People are always in search of Wi-Fi hotspots because Internet access is a major demand nowadays. But like all other technologies, there is still room for improvement in Wi-Fi technology with regard to the speed and quality of connectivity. To address these aspects, Harald Haas, a professor at the University of Edinburgh, proposed what is now known as Li-Fi (Light Fidelity). Li-Fi is a new technology in the field of wireless communication that provides connectivity within a network environment. It is a two-way mode of wireless communication using light. The data is transmitted through light-emitting diodes, which can vary the intensity of light very fast, faster than the blink of an eye. From the research and experiments conducted so far, Li-Fi can increase the speed and reliability of data transfer. This paper pays particular attention to the assessment of the performance of this technology. In other words, it is a 5G technology that uses LEDs as the medium of data transfer. For coverage within buildings, Wi-Fi is good, but Li-Fi can be considered favorable in situations where large amounts of data are to be transferred in areas with electromagnetic interference. It brings qualities such as efficiency, security, and large throughput to wireless communication. All in all, Li-Fi is likely to be a future phenomenon where the presence of light will mean access to the Internet as well as speedy data transfer.

Keywords: communication, LED, Li-Fi, Wi-Fi

Procedia PDF Downloads 349
2581 Cryptography and Cryptosystem: A Panacea to Security Risk in Wireless Networking

Authors: Modesta E. Ezema, Chikwendu V. Alabekee, Victoria N. Ishiwu, Ifeyinwa NwosuArize, Chinedu I. Nwoye

Abstract:

The advent of wireless networking in computing technology cannot be overemphasized: it opened up easy access to information resources, made networking easier, and brought Internet accessibility to our doorsteps. Despite all this, it also introduced problems that are causing mayhem in today's overall information security. Cyber criminals will always compromise the integrity of a message that is not encrypted or that is encrypted with a weak algorithm. To correct this, the study focuses on cryptosystems and cryptography, which ensure end-to-end encrypted messaging. Various cryptographic algorithms, as well as the techniques and applications of cryptography for efficiency, are considered in the work; present and future applications of cryptography are dealt with, and quantum cryptography is presented as the current and future direction in the development of cryptography. An empirical study was conducted to collect data from network users.

Keywords: algorithm, cryptography, cryptosystem, network

Procedia PDF Downloads 352
2580 Malware Detection in Mobile Devices by Analyzing Sequences of System Calls

Authors: Jorge Maestre Vidal, Ana Lucila Sandoval Orozco, Luis Javier García Villalba

Abstract:

With the increase in popularity of mobile devices, new and varied forms of malware have emerged. Consequently, cyberdefense organizations have echoed the need to deploy more effective defensive schemes adapted to the challenges posed by these monitoring environments. To contribute to their development, this paper presents a malware detection strategy for mobile devices based on sequence alignment algorithms. Unlike previous proposals, only the system calls performed during the startup of applications are studied. In this way, it is possible to efficiently study in depth the sequences of system calls executed by applications just downloaded from app stores and initialized in a secure and isolated environment. As demonstrated in the experiments performed, most of the analyzed malicious activities were successfully identified from their boot processes.
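
A minimal sketch of the kind of sequence-alignment comparison such a detector could apply to startup system-call traces. This is not the authors' implementation; the syscall names and scoring values are illustrative assumptions.

```python
# Global-alignment (Needleman-Wunsch) score between two system-call sequences,
# as a rough illustration of sequence-alignment-based detection.
# Syscall names and scoring parameters are illustrative, not from the paper.

def align_score(a, b, match=2, mismatch=-1, gap=-2):
    n, m = len(a), len(b)
    # dp[i][j] = best score aligning a[:i] with b[:j]
    dp = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        dp[i][0] = i * gap
    for j in range(1, m + 1):
        dp[0][j] = j * gap
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            s = match if a[i - 1] == b[j - 1] else mismatch
            dp[i][j] = max(dp[i - 1][j - 1] + s,   # align a[i-1] with b[j-1]
                           dp[i - 1][j] + gap,     # gap in b
                           dp[i][j - 1] + gap)     # gap in a
    return dp[n][m]

# Startup syscall traces (hypothetical): a known-malicious signature vs. a new app.
malicious_sig = ["open", "read", "socket", "connect", "write", "exec"]
new_app_trace = ["open", "read", "mmap", "socket", "connect", "write"]

score = align_score(malicious_sig, new_app_trace)
print("alignment score:", score)  # higher score -> more similar boot behaviour
```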

Keywords: android, information security, intrusion detection systems, malware, mobile devices

Procedia PDF Downloads 305
2579 An Artificial Intelligence Framework to Forecast Air Quality

Authors: Richard Ren

Abstract:

Air pollution is a serious danger to international well-being and economies: it kills an estimated 7 million people every year and will cost world economies $2.6 trillion by 2060 due to sick days, healthcare costs, and reduced productivity. In the United States alone, 60,000 premature deaths are caused by poor air quality. For this reason, there is a crucial need to develop effective methods to forecast air quality, which can mitigate air pollution's detrimental public health effects and associated costs by helping people plan ahead and avoid exposure. The goal of this study is to propose an artificial intelligence framework for predicting future air quality based on timing variables (e.g., season, weekday/weekend), future weather forecasts, and past pollutant and air quality measurements. The proposed framework utilizes multiple machine learning algorithms (logistic regression, random forest, neural network) with different specifications and averages the results of the three top-performing models to reduce the inaccuracies, weaknesses, and biases of any one individual model. Over time, the framework uses new data to self-adjust model parameters and increase prediction accuracy. To demonstrate its applicability, a prototype of the framework was created to forecast air quality in Los Angeles, California, using datasets from the RP4 weather data repository and EPA pollutant measurement data. The results showed good agreement between the framework's predictions and real-life observations, with an overall model accuracy of 92%. The combined model predicts more accurately than any of the individual models, and it reliably forecasts season-based variations in air quality levels. Top air quality predictor variables were identified through the measurement of mean decrease in accuracy. This study proposed and demonstrated the efficacy of a comprehensive air quality prediction framework that leverages multiple machine learning algorithms to overcome individual algorithm shortcomings. Future enhancements should focus on expanding and testing a greater variety of modeling techniques within the framework, testing it in different locations, and developing a platform to automatically publish future predictions as a web or mobile application. Accurate predictions from this artificial intelligence framework can in turn be used to save and improve lives by allowing individuals to protect their health and allowing governments to implement effective pollution control measures.
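
A minimal sketch of the averaging idea described above, using scikit-learn with synthetic placeholder data; the features, dataset, and model settings are illustrative assumptions, not the study's actual configuration.

```python
# Sketch of averaging three classifiers' probabilities, assuming a binary
# "unhealthy air quality" label; data and settings are placeholders.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Placeholder features standing in for season, weekend flag, weather forecast,
# and past pollutant levels.
X, y = make_classification(n_samples=2000, n_features=8, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

models = [
    LogisticRegression(max_iter=1000),
    RandomForestClassifier(n_estimators=200, random_state=0),
    MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=1000, random_state=0),
]
for m in models:
    m.fit(X_tr, y_tr)

# Average the predicted probabilities of the three models (soft voting).
avg_proba = np.mean([m.predict_proba(X_te)[:, 1] for m in models], axis=0)
combined_pred = (avg_proba >= 0.5).astype(int)
print("combined accuracy:", (combined_pred == y_te).mean())

# Predictor importance by mean decrease in accuracy can be obtained with
# sklearn.inspection.permutation_importance on the fitted models.
```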

Keywords: air quality prediction, air pollution, artificial intelligence, machine learning algorithms

Procedia PDF Downloads 131
2578 The Boundary Element Method in Excel for Teaching Vector Calculus and Simulation

Authors: Stephen Kirkup

Abstract:

This paper discusses the implementation of the boundary element method (BEM) on an Excel spreadsheet and how it can be used in teaching vector calculus and simulation. There are two separate spreadsheets, within which the Laplace equation is solved by the BEM in two dimensions (LIBEM2) and in axisymmetric three dimensions (LBEMA). The main algorithms are implemented in Excel's associated programming language, Visual Basic for Applications (VBA). The BEM requires only a boundary mesh and hence is a relatively accessible method. The BEM in the open spreadsheet environment is demonstrated to be useful as an aid to teaching and learning. The application of the BEM implemented on a spreadsheet for educational purposes in introductory vector calculus and simulation is explored. The development of assignment work is discussed, and sample results from student work are given. The spreadsheets were found to be useful tools in developing the students' understanding of vector calculus and in simulating heat conduction.

Keywords: boundary element method, Laplace’s equation, vector calculus, simulation, education

Procedia PDF Downloads 164
2577 Designing, Preparation and Structural Evaluation of Co-Crystals of Oxaprozin

Authors: Maninderjeet K. Grewal, Sakshi Bhatnor, Renu Chadha

Abstract:

The composition of pharmaceutical entities and their molecular interactions can be altered to optimize drug properties such as solubility and bioavailability by crystal engineering techniques. The present work emphasizes the preparation, characterization, and biopharmaceutical evaluation of a co-crystal of the BCS Class II anti-osteoarthritis drug oxaprozin (OXA) with aspartic acid (ASPA) as co-former. The co-crystals were prepared through the mechanochemical solvent-drop grinding method. Characterization of the prepared co-crystal (OXA-ASPA) was done using analytical tools such as differential scanning calorimetry (DSC), Fourier transform infrared spectroscopy (FT-IR), and powder X-ray diffraction (PXRD). The DSC thermogram of the OXA-ASPA co-crystal showed a single sharp melting endotherm at 235 °C, which lies between the melting peaks of the drug and the co-former, suggesting the formation of a new phase, a co-crystal, which was further confirmed by the other analytical techniques. FT-IR analysis of the OXA-ASPA co-crystal showed shifts in the hydroxyl, carbonyl, and amine peaks compared to the pure drug, indicating that all these functional groups participate in co-crystal formation. The appearance of new peaks in the PXRD pattern of the co-crystal, in comparison to the individual components, showed that a new crystalline entity had been formed. The crystal structure of the co-crystal was determined from PXRD using Materials Studio software (Biovia). The equilibrium solubility study of OXA-ASPA showed improved solubility compared to the pure drug. It was therefore envisioned that preparing a co-crystal of oxaprozin with a suitable co-former can modulate its physicochemical properties and, consequently, its biopharmaceutical parameters.

Keywords: cocrystals, coformer, oxaprozin, solubility

Procedia PDF Downloads 117
2576 Kinetic Study on Extracting Lignin from Black Liquor Using Deep Eutectic Solvents

Authors: Fatemeh Saadat Ghareh Bagh, Srimanta Ray, Jerald Lalman

Abstract:

Lignin, the largest inventory of organic carbon with a high calorific value, is a major component of woody and non-woody biomass. In pulping mills, a large amount of the lignin is burned for energy. At the same time, the phenolic structure of lignin enables it to be converted to value-added compounds. This study focused on extracting lignin from black liquor using deep eutectic solvents (DESs). Three choline chloride (ChCl) DESs paired with lactic acid (LA) (1:11), oxalic acid dihydrate (OX) (1:4), and malic acid (MA) (1:3) were synthesized at 90 °C and atmospheric pressure. The kinetics of lignin recovery from black liquor using the DESs was investigated at three moderate temperatures (338, 353, and 368 K) at time intervals from 30 to 210 min. The extracted lignin (acid-soluble lignin plus Klason lignin) was characterized by Fourier transform infrared spectroscopy (FTIR), which included comparing the extracted lignin with a model Kraft lignin. The acid-soluble lignin (ASL) fraction was determined spectrophotometrically [TAPPI UM 250], and the Klason lignin was determined gravimetrically using TAPPI T 222 om-02. The lignin extraction reaction using DESs was modeled by first-order reaction kinetics, and the activation energy of the process was determined. The ChCl:LA DES recovered 79.7 ± 2.1% of the lignin at 368 K and a DES:BL ratio of 4:1 (v/v); the quantity of lignin extracted by the control solvent, [emim][OAc], was 77.5 ± 2.2%. The activation energy measured for the LA-DES system was 22.7 kJ·mol⁻¹, while the activation energies for the OX-DES and MA-DES systems were 7.16 kJ·mol⁻¹ and 8.66 kJ·mol⁻¹, with total lignin recoveries of 75.4 ± 0.9% and 62.4 ± 1.4%, respectively.
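
A minimal sketch of the kinetic treatment described above: fit a first-order rate constant at each temperature, then obtain the activation energy from an Arrhenius plot. The yield values below are synthetic placeholders, not the paper's data.

```python
# First-order extraction kinetics and Arrhenius activation energy (sketch).
# Yields below are made-up placeholders; the paper's data are not reproduced.
import numpy as np

R = 8.314  # J mol^-1 K^-1

t = np.array([30, 60, 90, 120, 150, 180, 210], dtype=float)  # min
# Fraction of extractable lignin recovered at each temperature (synthetic).
yields = {
    338: np.array([0.18, 0.32, 0.44, 0.53, 0.61, 0.67, 0.72]),
    353: np.array([0.25, 0.43, 0.57, 0.67, 0.75, 0.80, 0.85]),
    368: np.array([0.33, 0.55, 0.69, 0.79, 0.86, 0.90, 0.93]),
}

# First-order model: y(t) = 1 - exp(-k t)  =>  -ln(1 - y) = k t.
rate_constants = {}
for T, y in yields.items():
    k, _ = np.polyfit(t, -np.log(1.0 - y), 1)  # slope = k (min^-1)
    rate_constants[T] = k

# Arrhenius: ln k = ln A - Ea / (R T); slope of ln k vs 1/T gives -Ea/R.
T_arr = np.array(sorted(rate_constants))
ln_k = np.log([rate_constants[T] for T in T_arr])
slope, intercept = np.polyfit(1.0 / T_arr, ln_k, 1)
Ea = -slope * R / 1000.0  # kJ mol^-1
print("rate constants:", rate_constants)
print(f"activation energy ~ {Ea:.1f} kJ/mol")
```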

Keywords: black liquor, deep eutectic solvents, kinetics, lignin

Procedia PDF Downloads 150
2575 Inventory Policy with Continuous Price Reduction in Solar Photovoltaic Supply Chain

Authors: Xiangrong Liu, Chuanhui Xiong

Abstract:

With concern over large pollution emissions from coal-fired power plants and new commitments to green energy, the global solar power industry has been emerging recently. Due to advances in technology, the price of the solar photovoltaic (PV) module has been falling at a fast rate, which raises an interesting but challenging question for the solar supply chain. This research models the inventory strategies for a PV supply chain with a PV manufacturer, an assembler, and an end customer. By characterizing the manufacturer's and the PV assembler's optimal decisions in decentralized and centralized settings, this study sheds light on how to improve supply chain performance through parameter setting in the contract design. The results suggest that the assembler should lower the optimal stock level gradually in each period before the price reduction and adopt a newsvendor base-stock policy in all periods after the price reduction. For the PV module manufacturer, a non-stationary produce-up-to policy is optimal.
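
As a quick illustration of the newsvendor base-stock idea mentioned above, the sketch below computes the critical-fractile order-up-to level for normally distributed demand; the cost figures and demand parameters are arbitrary assumptions, not values from the paper.

```python
# Newsvendor base-stock level under normal demand (illustrative numbers only).
from scipy.stats import norm

underage_cost = 40.0   # profit lost per unit of unmet demand (assumption)
overage_cost = 15.0    # cost per unsold unit, e.g. holding + markdown (assumption)
mu, sigma = 1000.0, 200.0  # per-period demand ~ Normal(mu, sigma) (assumption)

critical_fractile = underage_cost / (underage_cost + overage_cost)
base_stock = norm.ppf(critical_fractile, loc=mu, scale=sigma)
print(f"critical fractile = {critical_fractile:.3f}")
print(f"order-up-to (base-stock) level ~ {base_stock:.0f} units")
```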

Keywords: photovoltaic, supply chain, inventory policy, base-stock policy

Procedia PDF Downloads 350
2574 Modelling of Polymeric Fluid Flows between Two Coaxial Cylinders Taking into Account the Heat Dissipation

Authors: Alexander Blokhin, Ekaterina Kruglova, Boris Semisalov

Abstract:

A mathematical model based on the mesoscopic theory of polymer dynamics is developed for numerical simulation of the flows of a polymeric liquid between two coaxial cylinders. The model is a system of nonlinear partial differential equations written in the cylindrical coordinate system and coupled with the heat conduction equation, which includes a specific dissipation term. Stationary flows similar to the classical Poiseuille flow are considered, and the resolving equations for the flow velocity and the temperature are obtained. To solve them, a fast pseudospectral method is designed based on Chebyshev approximations, which enables the simulation of flows through channels with extremely small relative values of the inner cylinder radius. A numerical analysis of the dependence of the flow on this radius and on the value of the dissipation constant is carried out.
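
A minimal sketch of a Chebyshev collocation (pseudospectral) solver for a model two-point boundary-value problem, in the spirit of the method described above; it is not the authors' solver and uses the standard Chebyshev differentiation matrix on [-1, 1].

```python
# Chebyshev pseudospectral solution of u''(x) = f(x), u(-1) = u(1) = 0 (model problem).
import numpy as np

def cheb(N):
    """Chebyshev differentiation matrix D and Gauss-Lobatto points x (Trefethen's recipe)."""
    if N == 0:
        return np.zeros((1, 1)), np.array([1.0])
    x = np.cos(np.pi * np.arange(N + 1) / N)
    c = np.hstack([2.0, np.ones(N - 1), 2.0]) * (-1.0) ** np.arange(N + 1)
    X = np.tile(x, (N + 1, 1)).T
    dX = X - X.T
    D = np.outer(c, 1.0 / c) / (dX + np.eye(N + 1))
    D -= np.diag(D.sum(axis=1))
    return D, x

N = 32
D, x = cheb(N)
D2 = D @ D                      # second-derivative matrix
f = np.exp(x)                   # sample right-hand side

# Impose Dirichlet boundary conditions by solving on interior nodes only.
u = np.zeros(N + 1)
u[1:N] = np.linalg.solve(D2[1:N, 1:N], f[1:N])

exact = np.exp(x) - np.sinh(1.0) * x - np.cosh(1.0)  # exact solution of u'' = e^x, u(+-1) = 0
print("max error:", np.abs(u - exact).max())
```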

Keywords: dynamics of polymeric liquid, heat dissipation, singularly perturbed problem, pseudospectral method, Chebyshev polynomials, stabilization technique

Procedia PDF Downloads 293
2573 Empirical Evaluation of Gradient-Based Training Algorithms for Ordinary Differential Equation Networks

Authors: Martin K. Steiger, Lukas Heisler, Hans-Georg Brachtendorf

Abstract:

Deep neural networks and their variants form the backbone of many AI applications. Based on the so-called residual networks, a continuous formulation of such models as ordinary differential equations (ODEs) has proven advantageous, since different techniques may be applied that significantly increase the learning speed while enabling controlled trade-offs with the resulting error. For the evaluation of such models, high-performance numerical differential equation solvers are used, which also provide the gradients required for training. However, whether classical gradient-based methods are even applicable, or which one yields the best results, has not been discussed yet. This paper aims to remedy this situation by providing empirical results for different applications.
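
A minimal sketch of an ODE-block classifier trained with a standard gradient-based optimizer, using a fixed-step RK4 integrator and backpropagation through the solver steps (discretize-then-optimize). The architecture, data, and optimizer choice are toy assumptions, not those of the paper.

```python
# Toy ODE network: dh/dt = f_theta(h), integrated with fixed-step RK4 and trained
# by backpropagating through the solver steps (illustrative, not the paper's setup).
import torch
import torch.nn as nn

class ODEFunc(nn.Module):
    def __init__(self, dim=16):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, 64), nn.Tanh(), nn.Linear(64, dim))

    def forward(self, h):
        return self.net(h)

def rk4_integrate(f, h, t0=0.0, t1=1.0, steps=10):
    dt = (t1 - t0) / steps
    for _ in range(steps):
        k1 = f(h)
        k2 = f(h + 0.5 * dt * k1)
        k3 = f(h + 0.5 * dt * k2)
        k4 = f(h + dt * k3)
        h = h + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)
    return h

torch.manual_seed(0)
func, head, embed = ODEFunc(), nn.Linear(16, 2), nn.Linear(4, 16)
opt = torch.optim.Adam(list(func.parameters()) + list(head.parameters())
                       + list(embed.parameters()), lr=1e-3)

x = torch.randn(256, 4)                      # toy inputs
y = (x[:, 0] > 0).long()                     # toy labels
for epoch in range(100):
    opt.zero_grad()
    h1 = rk4_integrate(func, embed(x))       # evolve hidden state through the ODE block
    loss = nn.functional.cross_entropy(head(h1), y)
    loss.backward()                          # gradients flow through all RK4 steps
    opt.step()
print("final loss:", loss.item())
```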

Keywords: deep neural networks, gradient-based learning, image processing, ordinary differential equation networks

Procedia PDF Downloads 173
2572 Synthesis of New Bio-Based Solid Polymer Electrolyte Polyurethane-LiClO4 via Prepolymerization Method: Effect of NCO/OH Ratio on Their Chemical, Thermal Properties and Ionic Conductivity

Authors: C. S. Wong, K. H. Badri, N. Ataollahi, K. P. Law, M. S. Su’ait, N. I. Hassan

Abstract:

A novel bio-based polymer electrolyte was synthesized with LiClO4 as the main source of charge carriers. Polyurethane-LiClO4 polymer electrolytes were first synthesized via the polymerization method with different NCO/OH ratios and labelled PU1, PU2, PU3, and PU4. Subsequently, the chemical and thermal properties and the ionic conductivity of the films produced were determined. Fourier transform infrared (FTIR) analysis indicates coordination between the Li⁺ ion and the polyurethane in PU1, owing to the greatest amount of polyurethane hard segment in PU1, as confirmed by Soxhlet analysis. The structures of the polyurethanes were confirmed by ¹³C nuclear magnetic resonance spectroscopy (¹³C NMR) and FTIR spectroscopy. Differential scanning calorimetry (DSC) shows that PU1 has the highest glass transition temperature (Tg), corresponding to the most abundant urethane groups, which form the hard segment in PU1. Scanning electron microscopy (SEM) of PU-LiClO4 shows good miscibility between the lithium salt and the polymer. The study found that PU1 possessed the highest ionic conductivity (1.19 × 10⁻⁷ S cm⁻¹ at 298 K and 5.01 × 10⁻⁵ S cm⁻¹ at 373 K) and the lowest activation energy, Ea (0.32 eV), because the greater amount of hard segment formed in PU1 induces coordination between the lithium ion and the oxygen atom of the carbonyl group in the polyurethane. All the polyurethanes exhibited linear Arrhenius behaviour, indicating ion transport via simple lithium-ion hopping in the polyurethane. This research shows that the NCO content in the polyurethane plays an important role in the ionic conductivity of this polymer electrolyte.

Keywords: ionic conductivity, palm kernel oil-based monoester-OH, polyurethane, solid polymer electrolyte

Procedia PDF Downloads 429
2571 An Enhanced Particle Swarm Optimization Algorithm for Multiobjective Problems

Authors: Houda Abadlia, Nadia Smairi, Khaled Ghedira

Abstract:

Multiobjective Particle Swarm Optimization (MOPSO) has shown effective performance in solving test functions and real-world optimization problems. However, the method suffers from premature convergence, which may lead to a lack of diversity. To improve its performance, this paper presents a hybrid approach that embeds MOPSO into the island model and integrates a local search technique, Variable Neighborhood Search, to enhance diversity in the swarm. Experiments on two series of test functions have shown the effectiveness of the proposed approach. A comparison with other evolutionary algorithms shows that the proposed approach performs well in solving multiobjective optimization problems.
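
A minimal sketch of the island-model idea used here: several PSO populations evolve independently and periodically migrate their best particles. For brevity it is single-objective and omits the VNS local search; all parameters and the test function are illustrative assumptions, not the paper's configuration.

```python
# Island-model PSO sketch with periodic ring migration of best particles.
import numpy as np

rng = np.random.default_rng(0)

def sphere(x):                          # simple single-objective test function
    return np.sum(x ** 2, axis=-1)

def run_island_pso(n_islands=2, n_particles=20, dim=5, iters=200, migrate_every=25):
    pos = rng.uniform(-5, 5, (n_islands, n_particles, dim))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_val = sphere(pbest)
    for it in range(iters):
        for isl in range(n_islands):
            gbest = pbest[isl, np.argmin(pbest_val[isl])]
            r1, r2 = rng.random((2, n_particles, dim))
            vel[isl] = (0.7 * vel[isl]
                        + 1.5 * r1 * (pbest[isl] - pos[isl])
                        + 1.5 * r2 * (gbest - pos[isl]))
            pos[isl] += vel[isl]
            val = sphere(pos[isl])
            improved = val < pbest_val[isl]
            pbest[isl][improved] = pos[isl][improved]
            pbest_val[isl][improved] = val[improved]
        if (it + 1) % migrate_every == 0:
            # ring migration: each island's best particle replaces the next island's worst
            for isl in range(n_islands):
                nxt = (isl + 1) % n_islands
                best_i = np.argmin(pbest_val[isl])
                worst_j = np.argmax(pbest_val[nxt])
                pbest[nxt][worst_j] = pbest[isl][best_i]
                pbest_val[nxt][worst_j] = pbest_val[isl][best_i]
    return pbest_val.min()

print("best value found:", run_island_pso())
```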

Keywords: particle swarm optimization, migration, variable neighborhood search, multiobjective optimization

Procedia PDF Downloads 171
2570 Assessment of the Economic Potential of Lead Contaminated Brownfield for Growth of Oil Producing Crop Like Helianthus annuus (Sunflower)

Authors: Shahenaz Sidi, S. K. Tank

Abstract:

When sparsely used industrial and commercial facilities are retired or abandoned, one of the biggest issues that arise is what to do with the remaining land. This land, referred to as a 'brownfield site' or simply 'brownfield', is often contaminated with waste and pollutants left behind by the defunct industrial facilities and factories that stood on the land. Phytoremediation has proved a promising, greener, and cleaner technology for remediating such land compared with chemical excavation methods. Helianthus annuus is a hyperaccumulator of lead and can be used for remediation procedures in metal-contaminated soils. It is a fast-growing crop, which favours soil stabilization. Its tough leaves and stems are rarely eaten by animals. The seeds (actively eaten by birds) have very low concentrations of potentially toxic elements and represent a low risk to the food web. The study was conducted to determine the phytoextraction potential of the plant and the eventual seed harvesting and commercial oil production on the remediated soil.

Keywords: Brownfield, phytoextraction, helianthus, oil, commercial

Procedia PDF Downloads 339
2569 Strengthening Bridge Piers by Carbon Fiber Reinforced Polymer (CFRP): A Case Study for Thuan Phuoc Suspension Bridge in Vietnam

Authors: Lan Nguyen, Lam Cao Van

Abstract:

Thuan Phuoc is a suspension bridge built in Danang city, Vietnam. Because the bridge is located near an estuary, its structure has degraded rapidly, and many cracks have occurred on most of the concrete piers of the curved approach spans. This paper presents the results of a diagnostic analysis of the causes of the cracks, as well as calculations for strengthening the piers with carbon fiber reinforced polymer (CFRP). It also describes how the nonlinear concrete analysis software ATENA is used for the diagnostic crack analysis and the strengthening design. The results show that the crack distribution map of Thuan Phuoc bridge's concrete piers obtained with ATENA matches the real conditions and that CFRP is the best solution to strengthen the piers in a sound and fast way.

Keywords: ATENA, bridge pier strengthening, carbon fiber reinforced polymer (CFRP), crack prediction analysis

Procedia PDF Downloads 243
2568 Graphene-Based Nanocomposites for Glucose and Ethanol Enzymatic Biosensor Fabrication

Authors: Tesfaye Alamirew, Delele Worku, Solomon W. Fanta, Nigus Gabbiye

Abstract:

Recently, graphene-based nanocomposites have become an emerging research area for the fabrication of enzymatic biosensors due to their large surface area, conductivity, and biocompatibility. This review summarizes recent research reports on graphene-based nanocomposites for the fabrication of glucose and ethanol enzymatic biosensors. The newly fabricated enzyme-free, microwave-treated, nitrogen-doped graphene (MN-d-GR) provided the highest sensitivity towards glucose, and the GCE/rGO/AuNPs/ADH composite provided by far the highest sensitivity towards ethanol compared to other reported graphene-based nanocomposites. The MWCNT/GO/GOx and GCE/ErGO/PTH/ADH nanocomposites also exhibited wide linear ranges for glucose and ethanol detection, respectively. In general, graphene-based nanocomposite enzymatic biosensors have fast direct electron transfer rates, high sensitivity, and wide linear detection ranges for glucose and ethanol sensing.

Keywords: glucose, ethanol, enzymatic biosensor, graphene, nanocomposite

Procedia PDF Downloads 128
2567 Parallelization by Domain Decomposition for 1-D Sugarcane Equation with Message Passing Interface

Authors: Ewedafe Simon Uzezi

Abstract:

In this paper, we present a method based on domain decomposition (DD) for the parallelization of a 1-D sugarcane equation on a parallel platform, using a master-slave paradigm with the Message Passing Interface (MPI). The 1-D sugarcane equation was discretized using an explicit method, requiring evaluation of the temporal and spatial distribution of temperature. This platform gives better predictions of the temperature distribution in the sugarcane problem. The work reports parallel overheads, the overlapping of communication and computation across parallel computers, and numerical results for different block sizes, together with scalability. Performance improvement strategies from the DD on various mesh sizes were compared experimentally, and the parallel results show speedup and efficiency for the parallel algorithm design.
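
A minimal mpi4py sketch of the domain-decomposition pattern described above: the 1-D grid is split across ranks, each rank advances an explicit finite-difference update, and neighbouring ranks exchange halo values each step. The equation here is a generic 1-D diffusion equation with assumed coefficients; the sugarcane model's actual parameters and boundary conditions are not reproduced.

```python
# Run with:  mpiexec -n 4 python dd_diffusion_1d.py
# 1-D explicit diffusion with domain decomposition and halo exchange (sketch).
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

N_global, steps = 400, 500
alpha, dx = 1.0e-3, 1.0 / N_global
dt = 0.4 * dx * dx / alpha               # satisfies the explicit stability limit

n_local = N_global // size               # assume size divides N_global evenly
u = np.zeros(n_local + 2)                # local block plus two ghost cells
if rank == size // 2:                    # initial condition: hot spot on the middle rank
    u[1:-1] = 100.0

left = rank - 1 if rank > 0 else MPI.PROC_NULL
right = rank + 1 if rank < size - 1 else MPI.PROC_NULL

for _ in range(steps):
    # exchange halo cells with neighbouring subdomains
    comm.Sendrecv(sendbuf=u[1:2], dest=left, recvbuf=u[-1:], source=right)
    comm.Sendrecv(sendbuf=u[-2:-1], dest=right, recvbuf=u[0:1], source=left)
    # explicit finite-difference update on the interior of the local block
    u[1:-1] += alpha * dt / dx**2 * (u[2:] - 2 * u[1:-1] + u[:-2])

local_max = np.array([u[1:-1].max()])
global_max = np.zeros(1)
comm.Reduce(local_max, global_max, op=MPI.MAX, root=0)
if rank == 0:
    print("max temperature after", steps, "steps:", global_max[0])
```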

Keywords: sugarcane, parallelization, explicit method, domain decomposition, MPI

Procedia PDF Downloads 27
2566 Load Balancing Algorithms for SIP Server Clusters in Cloud Computing

Authors: Tanmay Raj, Vedika Gupta

Abstract:

For its groundbreaking and substantial power, cloud computing is today's most popular breakthrough. It is a form of Internet-based computing that allows users to request and receive numerous services in a cost-effective manner. Virtualization, grid computing, and utility computing are the most widely employed emerging technologies in cloud computing, making it the most powerful. However, cloud computing still faces a number of key challenges, such as security, load balancing, and non-critical failure adaptation, to name a few. The massive growth of cloud computing will put an undue strain on servers, and as a result, network performance will deteriorate. A good load balancing adjustment can make cloud computing more productive and increase client satisfaction. Load balancing is an important part of cloud computing because it prevents certain nodes from being overwhelmed while others are idle or have little work to perform. Response time, cost, throughput, performance, and resource usage are all parameters that may be improved using load balancing.
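
A minimal sketch of one common load-balancing policy, least connections, applied to a pool of SIP server nodes; the server names and call counts are hypothetical, and this is not a specific algorithm from the paper.

```python
# Least-connections load balancing across a pool of SIP servers (sketch).
# Server names and counts are hypothetical.
from dataclasses import dataclass, field

@dataclass
class SipServer:
    name: str
    active_calls: int = 0

@dataclass
class LeastConnectionsBalancer:
    servers: list = field(default_factory=list)

    def route(self) -> SipServer:
        # pick the server currently handling the fewest calls
        target = min(self.servers, key=lambda s: s.active_calls)
        target.active_calls += 1
        return target

    def release(self, server: SipServer) -> None:
        server.active_calls = max(0, server.active_calls - 1)

pool = LeastConnectionsBalancer([SipServer("sip-1"), SipServer("sip-2"), SipServer("sip-3")])
for call_id in range(7):
    chosen = pool.route()
    print(f"INVITE {call_id} -> {chosen.name} (now {chosen.active_calls} active)")
```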

Keywords: cloud computing, load balancing, computing, SIP server clusters

Procedia PDF Downloads 128
2565 Design and Simulation of Variable Air Volume Air Conditioning System Based on Improved Sliding Mode Control

Authors: Abbas Anser, Ahmad Irfan

Abstract:

The main purpose of VAV (Variable Air Volume) in a Heating, Ventilation, and Air Conditioning (HVAC) system is to reduce energy consumption and make buildings comfortable for the occupants. For better performance of the air conditioning system, different control techniques have been developed. In this paper, an Improved Sliding Mode Control (ISMC), based on a Power Rate Exponential Reaching Law (PRERL), is implemented on a VAV air conditioning system. The proposed technique achieves fast response and robustness. To verify the efficacy of ISMC, the suggested technique is compared with Exponential Reaching Law (ERL) based SMC. In addition, chattering is alleviated; chattering is unfavorable because the continuous movement of the mechanical parts deteriorates them and consequently increases energy loss in the air conditioning system. MATLAB/SIMULINK results show the effectiveness of the utilized scheme, which ensures enhancement of the energy efficiency of the VAV air conditioning system.
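
A minimal simulation sketch of sliding mode control with a power-rate reaching law on a generic second-order plant; the plant model, gains, and exponent are illustrative assumptions and do not reproduce the paper's VAV model or its PRERL formulation.

```python
# Sliding mode control with a power-rate reaching law on a double-integrator plant
# (an illustrative stand-in for the VAV dynamics; gains and exponent are assumptions).
import numpy as np

lam, k, alpha = 2.0, 3.0, 0.5      # surface slope, reaching gain, power-rate exponent
dt, T = 0.001, 5.0
x, xdot = 0.0, 0.0                 # state: e.g. zone-temperature deviation and its rate
x_ref = 1.0                        # constant setpoint

history = []
for _ in range(int(T / dt)):
    e = x - x_ref
    s = xdot + lam * e             # sliding surface s = de/dt + lam*e
    # power-rate reaching law: ds/dt = -k * |s|^alpha * sign(s)
    u = -lam * xdot - k * np.abs(s) ** alpha * np.sign(s)
    # double-integrator plant: xddot = u
    xdot += u * dt
    x += xdot * dt
    history.append((x, s))

print("final state x =", round(history[-1][0], 4), " sliding variable s =", round(history[-1][1], 6))
```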

Keywords: PID, SMC, HVAC, PRERL, feedback linearization, VAV, chattering

Procedia PDF Downloads 126
2564 Aspen Plus Simulation of Saponification of Ethyl Acetate in the Presence of Sodium Hydroxide in a Plug Flow Reactor

Authors: U. P. L. Wijayarathne, K. C. Wasalathilake

Abstract:

This work presents the modelling and simulation of the saponification of ethyl acetate in the presence of sodium hydroxide in a plug flow reactor, using Aspen Plus simulation software. Plug flow reactors are widely used in industry due to their non-mixing property; their use becomes significant when there is a need for a continuous, large-scale, or fast reaction. Plug flow reactors have a high volumetric unit conversion, as the occurrence of side reactions is minimal. In this research, Aspen Plus V8.0 was successfully used to simulate the plug flow reactor. In order to simulate the process as accurately as possible, the HYSYS Peng-Robinson EOS package was used as the property method. The results obtained from the simulation were verified by an experiment carried out in the EDIBON plug flow reactor module. The correlation coefficient (r²) was 0.98, which shows that the simulation results satisfactorily fit the experimental data. The developed model can be used as a guide for understanding the reaction kinetics of a plug flow reactor.
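
A minimal sketch of the plug flow reactor design equation for the equimolar second-order saponification of ethyl acetate with NaOH, integrated numerically; the rate constant, feed concentration, flow rate, and reactor volume are illustrative values, not those of the Aspen Plus model or the EDIBON experiment.

```python
# Plug flow reactor design equation for equimolar second-order saponification
# (EtOAc + NaOH -> products):  dX/dV = k * CA0 * (1 - X)^2 / v0.  Values are illustrative.
import numpy as np
from scipy.integrate import solve_ivp

k = 0.11       # L mol^-1 s^-1, rate constant (assumed value)
CA0 = 0.05     # mol L^-1, feed concentration of each reactant (assumed)
v0 = 0.02      # L s^-1, volumetric flow rate (assumed)
V_total = 0.5  # L, reactor volume (assumed)

def dX_dV(V, X):
    return k * CA0 * (1.0 - X[0]) ** 2 / v0

sol = solve_ivp(dX_dV, (0.0, V_total), [0.0], dense_output=True)
for v in np.linspace(0.0, V_total, 6):
    print(f"V = {v:0.2f} L  ->  conversion X = {sol.sol(v)[0]:0.3f}")
```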

Keywords: aspen plus, modelling, plug flow reactor, simulation

Procedia PDF Downloads 604
2563 Development and Application of the Proctoring System with Face Recognition for User Registration on the Educational Information Portal

Authors: Meruyert Serik, Nassipzhan Duisegaliyeva, Danara Tleumagambetova, Madina Ermaganbetova

Abstract:

This research paper explores the process of creating a proctoring system by evaluating the implementation of practical face recognition algorithms. The research work was reviewed by students of the educational programs "6B01511-Computer Science", "7M01511-Computer Science", "7M01525-STEM Education", and "8D01511-Computer Science" at L.N. Gumilyov Eurasian National University. As an outcome, a proctoring system will be created, enabling tests to be conducted and academic integrity checks to be ensured within the system. Test sessions are carried out to verify the correct operation of the system. The resulting proctoring system will form the basis for the automation of the informational and educational portal developed with machine learning.
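
A minimal sketch of the face-verification step such a proctoring system might use, based on the open-source face_recognition library; the image file names are placeholders, and this is not the authors' implementation.

```python
# Face verification step for user registration (sketch using the open-source
# face_recognition library; file names are placeholders).
import face_recognition

# Encoding stored at registration time on the education portal.
registered_img = face_recognition.load_image_file("registered_user.jpg")
registered_enc = face_recognition.face_encodings(registered_img)[0]

# Snapshot captured by the proctoring webcam during a test session.
snapshot_img = face_recognition.load_image_file("webcam_snapshot.jpg")
snapshot_encs = face_recognition.face_encodings(snapshot_img)

if not snapshot_encs:
    print("No face detected - flag for manual review")
else:
    match = face_recognition.compare_faces([registered_enc], snapshot_encs[0], tolerance=0.6)[0]
    distance = face_recognition.face_distance([registered_enc], snapshot_encs[0])[0]
    print(f"identity match: {match}, distance: {distance:.3f}")
```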

Keywords: artificial intelligence, education portal, face recognition, machine learning, proctoring

Procedia PDF Downloads 130
2562 Antibacterial Activity of Silver Nanoparticles of Extract of Leaf of Nauclea latifolia (Sm.) against Some Selected Clinical Isolates

Authors: Mustapha Abdulsalam, R. N. Ahmed

Abstract:

Nauclea latifolia is one of the medicinal plants used in traditional Nigerian medicine in the treatment of various diseases such as fever, toothache, malaria, and diarrhea, among several other conditions. Nauclea latifolia leaf extract acts as a capping and reducing agent in the formation of silver nanoparticles. Silver nanoparticles (AgNPs) were synthesized using a combination of aqueous extract of Nauclea latifolia and 1 mM silver nitrate (AgNO₃) solution to obtain concentrations of 100 mg/ml to 400 mg/ml. Characterization of the particles was done by UV-Vis spectroscopy and Fourier transform infrared spectroscopy (FTIR). In this study, aqueous as well as ethanolic extracts of the leaf of Nauclea latifolia were investigated for antibacterial activity using the standard agar well diffusion technique against three clinical isolates (Escherichia coli, Staphylococcus aureus, and Pseudomonas aeruginosa). The Minimum Inhibitory Concentration (MIC) was determined by the microbroth dilution method, and the Minimum Bactericidal Concentration (MBC) was determined by plate assay. Characterization by UV-Vis spectrometry revealed a peak absorbance of 0.463 at 450.0 nm, while FTIR showed the presence of two functional groups. At 400 mg/ml, the highest inhibitory activities were observed with S. aureus and E. coli, with zones of inhibition measuring 20 mm and 18 mm, respectively. The MIC was obtained at 400 mg/ml, while the MBC was at a higher concentration. The data from this study indicate the potential of silver nanoparticles of Nauclea latifolia as a suitable alternative antibacterial agent for incorporation into orthodox medicine in health care delivery in Nigeria.

Keywords: agar well diffusion, antimicrobial activity, Nauclea latifolia, silver nanoparticles

Procedia PDF Downloads 209
2561 Use of Magnetically Separable Molecular Imprinted Polymers for Determination of Pesticides in Food Samples

Authors: Sabir Khan, Sajjad Hussain, Ademar Wong, Maria Del Pilar Taboada Sotomayor

Abstract:

The present work aims to develop magnetic molecularly imprinted polymers (MMIPs) for the determination of a selected pesticide (ametryne) using high-performance liquid chromatography (HPLC). Computational simulation assisted the choice of the most suitable monomer for the synthesis of the polymers. The MMIPs were polymerized at the surface of Fe₃O₄@SiO₂ magnetic nanoparticles (MNPs) using 2-vinylpyridine as the functional monomer, ethylene glycol dimethacrylate (EGDMA) as the cross-linking agent, and 2,2′-azobisisobutyronitrile (AIBN) as the radical initiator. A magnetic non-molecularly imprinted polymer (MNIP) was also prepared under the same conditions without the analyte. The MMIPs were characterized by scanning electron microscopy (SEM), Brunauer-Emmett-Teller (BET) analysis, and Fourier transform infrared spectroscopy (FTIR). Pseudo-first-order and pseudo-second-order models were applied to study the adsorption kinetics, and it was found that the adsorption process followed the pseudo-first-order kinetic model. The adsorption equilibrium data were fitted to the Freundlich and Langmuir isotherms, and the sorption equilibrium was well described by the Langmuir isotherm model. The selectivity coefficients (α) of the MMIPs for ametryne with respect to atrazine, ciprofloxacin, and folic acid were 4.28, 12.32, and 14.53, respectively. Spiked recoveries between 91.33% and 106.80% were obtained. The results showed high affinity and selectivity of the MMIPs for the pesticide ametryne in the food samples.
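
A minimal sketch of fitting adsorption equilibrium data to the Langmuir isotherm, q = q_max·K·Ce/(1 + K·Ce), with scipy; the equilibrium concentrations and uptakes below are synthetic placeholders, not the paper's measurements.

```python
# Langmuir isotherm fit of adsorption equilibrium data (synthetic placeholder data).
import numpy as np
from scipy.optimize import curve_fit

def langmuir(Ce, q_max, K):
    """q = q_max * K * Ce / (1 + K * Ce)"""
    return q_max * K * Ce / (1.0 + K * Ce)

# Equilibrium concentration (mg/L) and adsorbed amount (mg/g) -- placeholders.
Ce = np.array([1.0, 2.5, 5.0, 10.0, 20.0, 40.0])
qe = np.array([4.1, 8.2, 13.0, 18.6, 23.5, 26.8])

(q_max, K), _ = curve_fit(langmuir, Ce, qe, p0=[30.0, 0.1])
residuals = qe - langmuir(Ce, q_max, K)
r_squared = 1.0 - np.sum(residuals**2) / np.sum((qe - qe.mean())**2)
print(f"q_max = {q_max:.2f} mg/g, K = {K:.3f} L/mg, R^2 = {r_squared:.4f}")
```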

Keywords: molecularly imprinted polymer, pesticides, magnetic nanoparticles, adsorption

Procedia PDF Downloads 468
2560 Modelling Railway Noise Over Large Areas, Assisted by GIS

Authors: Conrad Weber

Abstract:

The modelling of railway noise over large project areas can be very time consuming in terms of preparing the noise models and the calculation time. An open-source GIS program was utilised to assist with the modelling of operational noise levels for 675 km of railway corridor. A range of GIS algorithms was utilised to break the noise model area into manageable calculation sizes. GIS was utilised to prepare and filter a range of noise modelling inputs, including building files, land uses, and ground terrain. A spreadsheet was utilised to manage the accuracy of key input parameters, including train speeds, train types, curve corrections, bridge corrections, and engine notch settings. GIS was then utilised to present the final noise modelling results. This paper explains the noise modelling process and how the spreadsheet and GIS were utilised to model this massive project accurately and efficiently.

Keywords: noise, modeling, GIS, rail

Procedia PDF Downloads 124
2559 Application of Deep Learning in Top Pair and Single Top Quark Production at the Large Hadron Collider

Authors: Ijaz Ahmed, Anwar Zada, Muhammad Waqas, M. U. Ashraf

Abstract:

We demonstrate the performance of a very efficient tagger, based on deep neural network algorithms, applied to hadronically decaying top quark pairs as signal, and compare it with QCD multi-jet background events. A significant enhancement of performance in boosted top quark events is observed with our limited computing resources. We also compare modern machine learning approaches and perform a multivariate analysis of boosted top-pair as well as single top quark production through the weak interaction at a √s = 14 TeV proton-proton collider. The most relevant known background processes are incorporated. Using Boosted Decision Trees (BDT), likelihood, and Multilayer Perceptron (MLP) techniques, the analysis is trained and its performance is compared with the conventional cut-and-count approach.
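
A minimal sketch comparing a boosted decision tree and a multilayer perceptron on synthetic "signal vs. background" features with scikit-learn; the feature set and data are toy assumptions, not the analysis's actual jet variables or TMVA configuration.

```python
# Toy multivariate comparison of a BDT and an MLP for signal/background separation.
# Features and data are synthetic stand-ins for jet substructure variables.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=5000, n_features=10, n_informative=6, random_state=1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)

bdt = GradientBoostingClassifier(n_estimators=300, max_depth=3, random_state=1).fit(X_tr, y_tr)
mlp = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500, random_state=1).fit(X_tr, y_tr)

for name, clf in [("BDT", bdt), ("MLP", mlp)]:
    auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
    print(f"{name}: ROC AUC = {auc:.3f}")
```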

Keywords: top tagger, multivariate, deep learning, LHC, single top

Procedia PDF Downloads 113
2558 Efficient Recommendation System for Frequent and High Utility Itemsets over Incremental Datasets

Authors: J. K. Kavitha, D. Manjula, U. Kanimozhi

Abstract:

Mining frequent and high-utility itemsets has gained much significance in recent years. When data arrive sporadically, incremental and interactive rule mining and utility mining approaches can be adopted to handle users' dynamic environmental needs and avoid redundancies by reusing previous data structures and mining results. Dependence on recommendation systems has risen exponentially since the advent of search engines. This paper proposes a model for building a recommendation system that suggests frequent and high-utility itemsets over dynamic datasets for a cluster-based location prediction strategy to predict users' trajectories, using the Efficient Incremental Rule Mining (EIRM) algorithm and the Fast Update Utility Pattern Tree (FUUP) algorithm. Comprehensive experimental evaluations show that this scheme delivers excellent performance.

Keywords: data sets, recommendation system, utility item sets, frequent item sets mining

Procedia PDF Downloads 296
2557 A Fast Convergence Subband BSS Structure

Authors: Salah Al-Din I. Badran, Samad Ahmadi, Ismail Shahin

Abstract:

A blind source separation method is proposed that uses a non-uniform filter bank and a novel normalisation. The method provides reduced computational complexity and increased convergence speed compared to the full-band algorithm. Recently, adaptive sub-band schemes have been recommended to solve two problems: reducing computational complexity and increasing the convergence speed of the adaptive algorithm for correlated input signals. In this work, the reduction in computational complexity is achieved by using adaptive filters of orders lower than the full-band adaptive filters, which operate at a sampling rate lower than that of the input signal. The signals decomposed by the analysis filter bank are less correlated in each sub-band than the input signal at full bandwidth, and this promotes better rates of convergence.

Keywords: blind source separation, computational complexity, subband, convergence speed, mixture

Procedia PDF Downloads 558
2556 Simple Modified Method for DNA Isolation from Lyophilised Cassava Storage Roots (Manihot esculenta Crantz.)

Authors: P. K. Telengech, K. Monjero, J. Maling’a, A. Nyende, S. Gichuki

Abstract:

There is a need to identify an efficient protocol for the extraction of high-quality DNA for molecular work. Cassava roots are known for their high starch content, polyphenols, and other secondary metabolites, which interfere with the quality of the DNA. These factors interfere negatively with the various methodologies for DNA extraction, so there is a need to develop a simple, fast, and inexpensive protocol that yields high-quality DNA. In this improved Dellaporta method, the storage roots are lyophilized to reduce the water content, and the extraction buffer is modified to eliminate the high levels of polyphenols, starch, and wax. This simple protocol was compared to other protocols intended for plants with similar secondary metabolites. The method gave a high yield (300-950 ng) of pure DNA for use in PCR analysis. This improved Dellaporta protocol allows the isolation of pure DNA from starchy cassava storage roots.

Keywords: cassava storage roots, Dellaporta, DNA extraction, lyophilisation, polyphenols, secondary metabolites

Procedia PDF Downloads 366
2555 Experimental Characterization of the Color Quality and Error Rate for a Red, Green, and Blue-Based Light Emitting Diode Fixture Used in Visible Light Communications

Authors: Juan F. Gutierrez, Jesus M. Quintero, Diego Sandoval

Abstract:

An important feature of LED technology is its fast on-off switching, which allows data transmission. Visible Light Communication (VLC) is a wireless method of transmitting data with visible light. Modulation formats such as On-Off Keying (OOK) and Color Shift Keying (CSK) are used in VLC. CSK is based on three color bands and uses red, green, and blue monochromatic LEDs (RGB-LED) to define a pattern of chromaticities; this type of CSK provides poor color quality in the illuminated area. This work presents the design and implementation of a VLC system using RGB-based CSK with 16, 8, and 4 color points, mixed with a steady baseline of a phosphor white LED, to improve the color quality of the LED fixture. The experimental system was assessed in terms of the Color Rendering Index (CRI) and the Symbol Error Rate (SER). Good color quality performance of the LED fixture was obtained with an acceptable SER. The laboratory setup used to characterize and calibrate the LED fixture is described.
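
A minimal Monte Carlo sketch of estimating a symbol error rate for a 4-point CSK-like constellation in chromaticity coordinates under additive Gaussian noise; the constellation points and noise level are arbitrary assumptions, not the experimental fixture's values.

```python
# Monte Carlo SER estimate for a toy 4-point CSK constellation in (x, y) chromaticity
# coordinates with additive Gaussian noise (constellation and noise are assumptions).
import numpy as np

rng = np.random.default_rng(42)

# Four chromaticity points (arbitrary placeholders within the chromaticity diagram).
constellation = np.array([[0.20, 0.25], [0.30, 0.55], [0.45, 0.40], [0.60, 0.30]])
n_symbols, sigma = 100_000, 0.04

tx = rng.integers(0, len(constellation), n_symbols)              # transmitted symbol indices
rx = constellation[tx] + rng.normal(0.0, sigma, (n_symbols, 2))  # received noisy chromaticities

# Nearest-neighbour (minimum-distance) detection.
d2 = ((rx[:, None, :] - constellation[None, :, :]) ** 2).sum(axis=2)
detected = d2.argmin(axis=1)

ser = np.mean(detected != tx)
print(f"estimated SER = {ser:.4f}")
```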

Keywords: VLC, indoor lighting, color quality, symbol error rate, color shift keying

Procedia PDF Downloads 102
2554 Adoption of Big Data by Global Chemical Industries

Authors: Ashiff Khan, A. Seetharaman, Abhijit Dasgupta

Abstract:

The new era of big data (BD) is influencing chemical industries tremendously, providing several opportunities to reshape the way they operate and helping them shift towards intelligent manufacturing. Given the availability of free software and the large amount of real-time data generated and stored in process plants, chemical industries are still in the early stages of big data adoption. The industry is just starting to realize the importance of the large amount of data it owns for making the right decisions and supporting its strategies. This article explores the importance of professional competencies and data science, which influence BD in chemical industries, to help the industry move towards intelligent manufacturing quickly and reliably. The article uses a literature review and identifies potential applications in the chemical industry for moving from conventional methods to a data-driven approach. The scope of this document is limited to the adoption of BD in chemical industries and the variables identified in this article. To achieve this objective, government, academia, and industry must work together to overcome all present and future challenges.

Keywords: chemical engineering, big data analytics, industrial revolution, professional competence, data science

Procedia PDF Downloads 89
2553 Forecasting Free Cash Flow of an Industrial Enterprise Using Fuzzy Set Tools

Authors: Elena Tkachenko, Elena Rogova, Daria Koval

Abstract:

The paper examines ways of forecasting cash flows in a dynamic external environment. The so-called new reality in the economy lowers the predictability of companies' performance indicators due to the lack of long-term steady trends in external conditions of development and fast changes in the markets. Traditional methods based on trend analysis lead to a very high approximation error. The macroeconomic situation of the last 10 years has been defined by the continuing consequences of one financial crisis and the emergence of another. In these conditions, forecasting instruments based on fuzzy sets show good results. The fuzzy-set-based models lower the approximation error to an acceptable level and provide companies with a reliable cash flow estimation that helps them reach financial stability. In the paper, the applicability of a model of cash flow forecasting based on fuzzy logic is analyzed.
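
A minimal sketch of one fuzzy-set treatment of a cash flow forecast: each component is modeled as a triangular fuzzy number (pessimistic, most likely, optimistic), combined with fuzzy arithmetic and defuzzified by the centroid. The cash flow figures are invented placeholders, and this is not the paper's specific model.

```python
# Free cash flow forecast with triangular fuzzy numbers (figures are placeholders).
from dataclasses import dataclass

@dataclass
class TriFuzzy:
    low: float   # pessimistic value
    mid: float   # most likely value
    high: float  # optimistic value

    def __add__(self, other):
        return TriFuzzy(self.low + other.low, self.mid + other.mid, self.high + other.high)

    def __neg__(self):
        return TriFuzzy(-self.high, -self.mid, -self.low)

    def __sub__(self, other):
        return self + (-other)

    def centroid(self) -> float:
        # centroid defuzzification of a triangular membership function
        return (self.low + self.mid + self.high) / 3.0

# Forecast components for next year, in millions (invented numbers).
operating_cash_flow = TriFuzzy(80.0, 100.0, 130.0)
capital_expenditure = TriFuzzy(30.0, 40.0, 55.0)

free_cash_flow = operating_cash_flow - capital_expenditure
print("FCF fuzzy interval:", free_cash_flow)
print("defuzzified (centroid) FCF:", round(free_cash_flow.centroid(), 1))
```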

Keywords: cash flow, industrial enterprise, forecasting, fuzzy sets

Procedia PDF Downloads 211