Search results for: applied computing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 9048

8628 EECS: Reimagining the Future of Technology Education through Electrical Engineering and Computer Science Integration

Authors: Yousef Sharrab, Dimah Al-Fraihat, Monther Tarawneh, Aysh Alhroob, Ala’ Khalifeh, Nabil Sarhan

Abstract:

This paper explores the evolution of Electrical Engineering (EE) and Computer Science (CS) education in higher learning, examining the feasibility of unifying them into Electrical Engineering and Computer Science (EECS) for the technology industry. It delves into the historical reasons for their separation and underscores the need for integration. Emerging technologies such as AI, Virtual Reality, IoT, Cloud Computing, and Cybersecurity demand an integrated EE and CS program to enhance students' understanding. The study evaluates curriculum integration models, drawing from prior research and case studies, demonstrating how integration can provide students with a comprehensive knowledge base for industry demands. Successful integration necessitates addressing administrative and pedagogical challenges. For academic institutions considering merging EE and CS programs, the paper offers guidance, advocating for a flexible curriculum encompassing foundational courses and specialized tracks in computer engineering, software engineering, bioinformatics, information systems, data science, AI, robotics, IoT, virtual reality, cybersecurity, and cloud computing. Elective courses are emphasized to keep pace with technological advancements. Implementing this integrated approach can prepare students for success in the technology industry, addressing the challenges of a technologically advanced society reliant on both EE and CS principles. Integrating EE and CS curricula is crucial for preparing students for the future.

Keywords: electrical engineering, computer science, EECS, curriculum integration of EE and CS

Procedia PDF Downloads 33
8627 An Investigation of Performance Versus Security in Cognitive Radio Networks with Supporting Cloud Platforms

Authors: Kurniawan D. Irianto, Demetres D. Kouvatsos

Abstract:

The growth of wireless devices affects the availability of the limited frequencies or spectrum bands, since spectrum is a natural resource that cannot be expanded. Many studies of spectrum availability have shown that licensed frequencies are idle most of the time. Cognitive radio is one of the solutions to this problem. Cognitive radio is a promising technology that allows unlicensed users, known as secondary users (SUs), to access licensed bands without interfering with licensed users, or primary users (PUs). As cloud computing has become popular in recent years, cognitive radio networks (CRNs) can be integrated with a cloud platform. One of the important issues in CRNs is security. It is a problem because CRNs use radio frequencies as the transmission medium and therefore share the vulnerabilities of wireless communication systems. Another critical issue in CRNs is performance. Security has an adverse effect on performance, and there are trade-offs between them. The goal of this paper is to investigate the performance-versus-security trade-off in CRNs with supporting cloud platforms. Furthermore, queuing network models with preemptive resume and preemptive repeat identical priority are applied in this project to measure the impact of security on performance in CRNs with and without a cloud platform. The generalized exponential (GE) type distribution is used to reflect the bursty inter-arrival and service times at the servers. The results show that the best performance is obtained when security is disabled and the cloud platform is enabled.
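
As a rough illustration of the modelling approach, the sketch below samples GE-type inter-arrival and service times and runs a minimal single-server queue simulation, comparing mean response times with and without a per-packet security overhead. All parameters (rates, SCV, overhead) are hypothetical, and the simple FCFS queue is a stand-in for the paper's priority queuing network models.

```python
import random

def ge_sample(mean, scv, rng):
    # GE-type sample: with probability p = (SCV - 1)/(SCV + 1) the interval
    # is zero (a bursty batch arrival), otherwise it is exponential, so the
    # mixture has the requested mean and squared coefficient of variation.
    p = (scv - 1.0) / (scv + 1.0)
    if rng.random() < p:
        return 0.0
    return rng.expovariate((1.0 - p) / mean)

def mean_response(mean_arrival, mean_service, scv, n, overhead=0.0, seed=1):
    # Lindley recursion for a single-server FCFS queue (a simplification of
    # the paper's preemptive priority queuing network models).
    rng, wait, total = random.Random(seed), 0.0, 0.0
    for _ in range(n):
        service = ge_sample(mean_service, scv, rng) + overhead
        total += wait + service
        wait = max(0.0, wait + service - ge_sample(mean_arrival, scv, rng))
    return total / n

# Hypothetical SU traffic: mean inter-arrival 1.0 s, mean service 0.6 s,
# bursty (SCV = 4); enabling security adds 0.15 s of processing per packet.
print("security off:", mean_response(1.0, 0.6, 4.0, 200_000))
print("security on :", mean_response(1.0, 0.6, 4.0, 200_000, overhead=0.15))
```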

Keywords: performance vs. security, cognitive radio networks, cloud platforms, GE-type distribution

Procedia PDF Downloads 326
8626 R Statistical Software Applied in Reliability Analysis: Case Study of Diesel Generator Fans

Authors: Jelena Vucicevic

Abstract:

Reliability analysis represents a very important task in different areas of work. In any industry, it is crucial for maintenance, efficiency, safety and monetary costs. There are established ways to calculate reliability, unreliability, failure density and failure rate. This paper introduces another way of calculating reliability, using the R statistical software. R is a free software environment for statistical computing and graphics. It compiles and runs on a wide variety of UNIX platforms, Windows and MacOS. The R programming environment is a widely used open source system for statistical analysis and statistical programming. It includes thousands of functions for the implementation of both standard and new statistical methods, and it does not limit the user to these built-in functions. The program has many benefits over similar programs: it is free and, as open source, constantly updated; it has a built-in help system; and the R language is easy to extend with user-written functions. The significance of the work is the calculation of time to failure or reliability in a new, statistical way. Another advantage of this approach is that no technical details are needed; it can be applied to any part for which we need to know the time to failure, in order to schedule appropriate maintenance and also to maximize usage and minimize costs. In this case, the calculations have been made on diesel generator fans, but the same principle can be applied to any other part. The data for this paper came from a field engineering study of the time to failure of diesel generator fans. The ultimate goal was to decide whether or not to replace the working fans with a higher quality fan to prevent future failures. Seventy generators were studied. For each one, the number of hours of running time from its first being put into service until fan failure, or until the end of the study (whichever came first), was recorded. The dataset consists of two variables: hours and status. Hours gives the running time of each fan, and status records the event: 1 for failed, 0 for censored data. Censored data represent cases that could not be tracked to the end of their life, so the fan may have either failed or survived afterwards. Obtaining the result with R was easy and quick: the program takes censored data into consideration and includes them in the results, which is not so easy in hand calculation. For the purposes of the paper, the results from R have been compared to hand calculations in two different cases: censored data treated as failures, and censored data treated as successes. The results of all three calculations differ significantly. If the user decides to use R for further calculations, its proper handling of censored data will give more precise results than hand calculation.
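
A minimal Python analogue of the analysis described here, using the third-party lifelines library's Kaplan-Meier estimator on a small hypothetical hours/status dataset (the values below are illustrative, not the actual 70-fan data):

```python
import numpy as np
from lifelines import KaplanMeierFitter

# Hypothetical running times (hours) and status (1 = failed, 0 = censored),
# mirroring the two-variable dataset described in the abstract.
hours  = np.array([4500, 4600, 11500, 11500, 15600, 16000, 18500,
                   20300, 20300, 20700, 21000, 22000])
status = np.array([1, 0, 1, 1, 1, 0, 1, 1, 0, 1, 1, 0])

kmf = KaplanMeierFitter()
kmf.fit(durations=hours, event_observed=status)  # censoring handled here

# Estimated reliability (survival probability) at selected running times.
print(kmf.survival_function_at_times([5000, 15000, 20000]))
print("median time to failure:", kmf.median_survival_time_)
```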

Keywords: censored data, R statistical software, reliability analysis, time to failure

Procedia PDF Downloads 382
8625 A Model of Applied Psychology Research Defining Community Participation and Collective Identity as a Major Asset for Strategic Planning and Political Decision: The Project SIA (Social Inclusion through Accessibility)

Authors: Rui Serôdio, Alexandra Serra, José Albino Lima, Luísa Catita, Paula Lopes

Abstract:

We will present the outline of Project SIA (Social Inclusion through Accessibility), focusing on one of its core components: how our applied research model contributes to defining community participation as a pillar of the strategic and political agenda of local authorities. Project SIA, supported by EU regional funding, was designed as part of a broader model developed by SIMLab (Social Inclusion Monitoring Laboratory), in which the university-community relationship is a core element. The project illustrates how the University of Porto developed a large-scale applied psychology research project in close partnership with 18 municipalities covering almost all regions of Portugal, and with a private architecture firm specialized in inclusive accessibility and “design for all”. Three fundamental goals were defined: (1) the creation of a model that would promote the effective civic participation of local citizens; (2) that the “voice” of such participation should be both individual and collective; (3) that the scientific and technical framework should serve as one of the bases for political decisions on inclusive local accessibility planning. The two main studies were run with a standardized model across all municipalities, and the samples for the three modalities of community participation were the following: individual participation based on 543 semi-structured interviews and 6,373 questionnaires, and collective participation based on group sessions with 302 local citizens. We present some of the broader findings of Project SIA and discuss how they relate to our applied research model.

Keywords: applied psychology, collective identity, community participation, inclusive accessibility

Procedia PDF Downloads 417
8624 Extended Kalman Filter and Markov Chain Monte Carlo Method for Uncertainty Estimation: Application to X-Ray Fluorescence Machine Calibration and Metal Testing

Authors: S. Bouhouche, R. Drai, J. Bast

Abstract:

This paper is concerned with a method for evaluating the uncertainty of steel sample content measured using the X-Ray Fluorescence method. The considered method of analysis is a comparative technique based on X-Ray Fluorescence; the calibration step assumes an adequate chemical composition of the analyzed metallic sample. This work proposes a new combined approach using the Kalman filter and Markov Chain Monte Carlo (MCMC) for uncertainty estimation of steel content analysis. The Kalman filter algorithm is extended to the model identification of the chemical analysis process using the main factors affecting the analysis results; in this case, the estimated states are reduced to the model parameters. MCMC is a stochastic method that computes the statistical properties of the considered states, such as the probability density function (PDF), from the initial state and the target distribution using a Monte Carlo simulation algorithm. The conventional approach is based on linear correlation; the uncertainty budget is established for the steel Mn (wt%), Cr (wt%), Ni (wt%) and Mo (wt%) content, respectively. A comparative study between the conventional procedure and the proposed method is given. This kind of approach is applied for constructing an accurate computing procedure for uncertainty measurement.
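
To make the MCMC step concrete, here is a minimal sketch, not the authors' actual model: a random-walk Metropolis sampler estimating the posterior of a single linear calibration slope on hypothetical, simulated intensity-versus-content data, reporting a mean and standard uncertainty.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical XRF calibration data: certified Mn content (wt%) vs intensity.
x = np.array([0.35, 0.52, 0.71, 0.90, 1.10])        # certified Mn content
y = 2.0 * x + rng.normal(0.0, 0.03, x.size)         # simulated intensities
sigma = 0.03                                        # assumed noise level

def log_post(a):
    # Flat prior; Gaussian likelihood for the linear model y = a * x.
    return -0.5 * np.sum((y - a * x) ** 2) / sigma ** 2

samples, a = [], 1.0
lp = log_post(a)
for _ in range(20000):
    prop = a + rng.normal(0.0, 0.05)                # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:         # Metropolis acceptance
        a, lp = prop, lp_prop
    samples.append(a)

post = np.array(samples[5000:])                     # discard burn-in
print(f"slope = {post.mean():.4f} +/- {post.std():.4f}")  # MCMC uncertainty
```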

Keywords: Kalman filter, Markov chain Monte Carlo, x-ray fluorescence calibration and testing, steel content measurement, uncertainty measurement

Procedia PDF Downloads 261
8623 The Acoustic Performance of a Double-Skin Wind Energy Facade

Authors: Sara Mota Carmo

Abstract:

Wind energy applied in architecture has been largely abandoned due to the uncomfortable noise it causes. This study investigates the acoustic performance of a double-skin wind energy facade in both the urban environment and the indoor environment. Sound transmission measurements were recorded with a hand-held sound meter on a reduced-scale prototype of a wind energy facade. The applied wind intensities ranged between 2 m/s and 8 m/s, and the sound produced increased in proportion to the wind intensity. The study validates the acoustic performance of a wind energy facade using a double-skin facade system, showing that indoor noise is reduced by approximately 30 to 35 dB. However, the results show that above a wind intensity of 6 m/s, the wind energy system applied to the facade exceeds, in the urban environment, the maximum of 50 dB recommended by the World Health Organization, and needs some adjustments.

Keywords: double-skin wind energy facade, acoustic energy facade, wind energy in architecture, wind energy prototype

Procedia PDF Downloads 71
8622 Cloud Resources Utilization and Science Teacher’s Effectiveness in Secondary Schools in Cross River State, Nigeria

Authors: Michael Udey Udam

Abstract:

Background: This study investigated the impact of cloud resources, a component of cloud computing, on science teachers' effectiveness in secondary schools in Cross River State. Three (3) research questions and three (3) alternative hypotheses guided the study. Method: The descriptive survey design was adopted for the study. The population of the study comprised 1,209 science teachers in public secondary schools of Cross River State. Sample: A sample of 487 teachers was drawn from the population using a stratified random sampling technique. A researcher-made structured questionnaire with 18 items was used for data collection. Research question one was answered using the Pearson Product Moment Correlation, while research question two and the hypotheses were answered using Analysis of Variance (ANOVA) statistics in the Statistical Package for the Social Sciences (SPSS) at a 0.05 level of significance. Results: The results revealed that there is a positive correlation between the utilization of cloud resources in teaching and teaching effectiveness among science teachers in secondary schools in Cross River State; there is a negative correlation between gender and the utilization of cloud resources; and there is a significant correlation between teaching experience and the utilization of cloud resources. Conclusion: The study justifies the effectiveness of the Cross River State government policy of introducing cloud computing into the education sector. The study recommends that the policy be sustained.
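
A small sketch of the two statistical tests named above, using SciPy rather than SPSS, on entirely hypothetical score data:

```python
from scipy import stats

# Hypothetical questionnaire scores: cloud-resource utilization vs teaching
# effectiveness for a handful of teachers (illustrative values only).
utilization   = [12, 15, 9, 18, 14, 11, 17, 13]
effectiveness = [30, 34, 25, 40, 33, 27, 38, 31]

r, p = stats.pearsonr(utilization, effectiveness)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")

# One-way ANOVA across experience groups (utilization scores per group).
novice, mid_career, veteran = [9, 11, 12], [13, 14, 15], [17, 18, 16]
f, p = stats.f_oneway(novice, mid_career, veteran)
print(f"ANOVA F = {f:.2f}, p = {p:.3f}")  # compare p against alpha = 0.05
```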

Keywords: cloud resources, science teachers, effectiveness, secondary school

Procedia PDF Downloads 46
8621 The Potential of Fly Ash Wastes to Improve Nutrient Levels in Agricultural Soils: A Material Flow Analysis Case Study from Riau District, Indonesia

Authors: Hasan Basri Jumin

Abstract:

Fly ash waste from the pulp and paper industries, when treated with a suitable process and proper management, may possibly be used as a fertilizer for agricultural purposes. The objective of this work is to evaluate the possibility of recycling fly ash waste for application as a fertilizer in agriculture. Fly ash applied to maize at 28 g/plant significantly increased the average seed dry weight from 6.7 g/plant to 10.3 g/plant, and net assimilation rates increased from 14.5 mg.m-2.day-1 to 35.4 mg.m-2.day-1. As a result, production reached 3.2 ton/ha. Chemical analyses of the fly ash waste indicated that the contents of dangerous metals do not exceed the thresholds for biological effects. The mercury, arsenic, cadmium, chromium, cobalt, lead, and molybdenum contents are lower than the thresholds of human health tolerance; therefore, no harmful effects on human health are expected. This experiment indicated that fly ash waste in low doses, up to 28 g/plant, could be applied as a substitute fertilizer for agricultural use, which would also reduce environmental pollution.

Keywords: fly-ash, fertilizer, maize, sludge-sewage pollutant, waste

Procedia PDF Downloads 563
8620 An Adjoint-Based Method to Compute Derivatives with Respect to Bed Boundary Positions in Resistivity Measurements

Authors: Mostafa Shahriari, Theophile Chaumont-Frelet, David Pardo

Abstract:

Resistivity measurements are used to characterize the Earth’s subsurface. They are categorized into two different groups: (a) those acquired on the Earth’s surface, for instance, controlled source electromagnetic (CSEM) and Magnetotellurics (MT), and (b) those recorded with borehole logging instruments such as Logging-While-Drilling (LWD) devices. LWD instruments are mostly used for geo-steering purposes, i.e., to adjust dip and azimuthal angles of a well trajectory to drill along a particular geological target. Modern LWD tools measure all nine components of the magnetic field corresponding to three orthogonal transmitter and receiver orientations. In order to map the Earth’s subsurface and perform geo-steering, we invert measurements using a gradient-based method that utilizes the derivatives of the recorded measurements with respect to the inversion variables. For resistivity measurements, these inversion variables are usually the constant resistivity value of each layer and the bed boundary positions. It is well-known how to compute derivatives with respect to the constant resistivity value of each layer using semi-analytic or numerical methods. However, similar formulas for computing the derivatives with respect to bed boundary positions are unavailable. The main contribution of this work is to provide an adjoint-based formulation for computing derivatives with respect to the bed boundary positions. The key idea to obtain the aforementioned adjoint state formulations for the derivatives is to separate the tangential and normal components of the field and treat them differently. This formulation allows us to compute the derivatives faster and more accurately than with traditional finite differences approximations. In the presentation, we shall first derive a formula for computing the derivatives with respect to the bed boundary positions for the potential equation. Then, we shall extend our formulation to 3D Maxwell’s equations. Finally, by considering a 1D domain and reducing the dimensionality of the problem, which is a common practice in the inversion of resistivity measurements, we shall derive a formulation to compute the derivatives of the measurements with respect to the bed boundary positions using a 1.5D variational formulation. Then, we shall illustrate the accuracy and convergence properties of our formulations by comparing numerical results with the analytical derivatives for the potential equation. For the 1.5D Maxwell’s system, we shall compare our numerical results based on the proposed adjoint-based formulation vs those obtained with a traditional finite difference approach. Numerical results shall show that our proposed adjoint-based technique produces enhanced accuracy solutions while its cost is negligible, as opposed to the finite difference approach that requires the solution of one additional problem per derivative.
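
The contrast between the two approaches can be sketched numerically. The toy Python example below, our own simplification and not the authors' formulation, discretizes a 1D potential equation with one material interface at position b and compares dm/db for a point measurement computed (a) with the adjoint identity dm/db = -λᵀ(∂A/∂b)u, where one adjoint solve Aᵀλ = c replaces repeated forward solves, and (b) with traditional finite differences on the measurement itself. The matrix sensitivity ∂A/∂b is approximated by differencing only the (cheap) assembly; the node index, conductivities and mesh size are all hypothetical.

```python
import numpy as np

def assemble(b, n=200, s1=1.0, s2=5.0):
    # Tridiagonal FD matrix for -(sigma u')' = f on (0,1), u(0) = u(1) = 0,
    # with sigma = s1 for x < b and s2 for x > b. The face conductivity of
    # the cell containing b is a harmonic average, which is smooth in b.
    h = 1.0 / n
    faces = np.empty(n)
    for i in range(n):
        xl, xr = i * h, (i + 1) * h
        if b <= xl:
            faces[i] = s2
        elif b >= xr:
            faces[i] = s1
        else:
            t = (b - xl) / h
            faces[i] = 1.0 / (t / s1 + (1.0 - t) / s2)
    A = np.zeros((n - 1, n - 1))
    for i in range(n - 1):
        A[i, i] = (faces[i] + faces[i + 1]) / h**2
        if i > 0:
            A[i, i - 1] = -faces[i] / h**2
        if i < n - 2:
            A[i, i + 1] = -faces[i + 1] / h**2
    return A

n, b, eps, k = 200, 0.372, 1e-6, 49            # hypothetical receiver node k
f = np.ones(n - 1)                             # unit source term
A = assemble(b, n)
u = np.linalg.solve(A, f)                      # one forward solve
lam = np.linalg.solve(A.T, np.eye(n - 1)[k])   # one adjoint solve
dA = (assemble(b + eps, n) - assemble(b - eps, n)) / (2 * eps)  # assembly only
print("adjoint     dm/db:", -lam @ dA @ u)

# Traditional finite differences: one extra forward solve per perturbation.
m = lambda bb: np.linalg.solve(assemble(bb, n), f)[k]
print("finite-diff dm/db:", (m(b + eps) - m(b - eps)) / (2 * eps))
```

With many bed boundaries, the adjoint version still needs only the single extra solve for λ, while the finite difference approach needs extra forward solves per boundary, which is the cost advantage claimed above.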

Keywords: inverse problem, bed boundary positions, electromagnetism, potential equation

Procedia PDF Downloads 159
8619 Optimizing Data Transfer and Processing in Multi-Cloud Environments for Big Data Workloads

Authors: Gaurav Kumar Sinha

Abstract:

In an era defined by the proliferation of data and the utilization of cloud computing environments, the efficient transfer and processing of big data workloads across multi-cloud platforms have emerged as critical challenges. This research paper embarks on a comprehensive exploration of the complexities associated with managing and optimizing big data in a multi-cloud ecosystem. The foundation of this study is rooted in the recognition that modern enterprises increasingly rely on multiple cloud providers to meet diverse business needs, enhance redundancy, and reduce vendor lock-in. As a consequence, managing data across these heterogeneous cloud environments has become intricate, necessitating innovative approaches to ensure data integrity, security, and performance. The primary objective of this research is to investigate strategies and techniques for enhancing the efficiency of data transfer and processing in multi-cloud scenarios. It recognizes that big data workloads are characterized by their sheer volume, variety, velocity, and complexity, making traditional data management solutions insufficient for harnessing the full potential of multi-cloud architectures. The study commences by elucidating the challenges posed by multi-cloud environments in the context of big data. These challenges encompass data fragmentation, latency, security concerns, and cost optimization. To address these challenges, the research explores a range of methodologies and solutions. One of the key areas of focus is data transfer optimization. The paper delves into techniques for minimizing data movement latency, optimizing bandwidth utilization, and ensuring secure data transmission between different cloud providers. It evaluates the applicability of dedicated data transfer protocols, intelligent data routing algorithms, and edge computing approaches in reducing transfer times. Furthermore, the study examines strategies for efficient data processing across multi-cloud environments. It acknowledges that big data processing requires distributed and parallel computing capabilities that span across cloud boundaries. The research investigates containerization and orchestration technologies, serverless computing models, and interoperability standards that facilitate seamless data processing workflows. Security and data governance are paramount concerns in multi-cloud environments. The paper explores methods for ensuring data security, access control, and compliance with regulatory frameworks. It considers encryption techniques, identity and access management, and auditing mechanisms as essential components of a robust multi-cloud data security strategy. The research also evaluates cost optimization strategies, recognizing that the dynamic nature of multi-cloud pricing models can impact the overall cost of data transfer and processing. It examines approaches for workload placement, resource allocation, and predictive cost modeling to minimize operational expenses while maximizing performance. Moreover, this study provides insights into real-world case studies and best practices adopted by organizations that have successfully navigated the challenges of multi-cloud big data management. It presents a comparative analysis of various multi-cloud management platforms and tools available in the market.
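
As one concrete instance of the workload placement and cost-modeling theme, the sketch below picks the cheapest provider for a batch job under a toy cost model; all provider names, prices and workload figures are hypothetical.

```python
# Hypothetical per-provider prices: compute ($/hour) and egress ($/GB).
providers = {
    "cloud_a": {"compute": 0.90, "egress": 0.09},
    "cloud_b": {"compute": 1.10, "egress": 0.05},
    "cloud_c": {"compute": 0.80, "egress": 0.12},
}

def placement_cost(p, hours, gb_out):
    # Total cost of running the workload on provider p, including moving
    # its output data out of that provider afterwards.
    return p["compute"] * hours + p["egress"] * gb_out

# A batch job needing 40 compute hours and producing 500 GB of output.
hours, gb = 40, 500
best = min(providers, key=lambda name: placement_cost(providers[name], hours, gb))
for name, p in providers.items():
    print(f"{name}: ${placement_cost(p, hours, gb):,.2f}")
print("cheapest placement:", best)
```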

Keywords: multi-cloud environments, big data workloads, data transfer optimization, data processing strategies

Procedia PDF Downloads 44
8618 Contraction and Membrane Potential of C2C12 with GTXs

Authors: Bayan Almofty, Yuto Yamaki, Tadamasa Terai, Sadahito Uto

Abstract:

Culture techniques for skeletal muscle cells are advanced in the field of regenerative medicine and in applied research on cultured muscle. As applied research on cultured muscle, treatments for myopathy (muscle disease) are expected, and the development of bio-actuators is also expected in biomedical engineering. Grayanotoxins (GTXs) are known as neurotoxins that enhance the permeability of the cell membrane to Na ions. Grayanotoxins are phytotoxins extracted from Pieris japonica and other Ericaceae. In this study, we investigated the effect of GTXs on the contraction and membrane potential of muscle cells (C2C12). Contraction of the myotubes was induced by external electrical stimulation, and contraction and membrane potential changes of the skeletal muscle cells were induced by current injection. We therefore concluded that the effects of grayanotoxins on the contraction and membrane potential of C2C12 relate to the acute toxicity of GTXs.

Keywords: skeletal muscle cells C2C12, grayanotoxins, contraction, membrane potential, acute toxicity, phytotoxin, myotubes

Procedia PDF Downloads 487
8617 Response of Wheat and Lentil to Herbicides Applied in the Preceding Non-Puddled Transplanted Rainy Season Rice

Authors: Taslima Zahan

Abstract:

A field study was conducted in 2013-14 and 2014-15, following a bioassay technique, to determine the carryover effect of herbicides applied in rainy season rice on the growth and yield of two probable succeeding crops, wheat and lentil. Rice seedlings were transplanted into a strip-tilled, non-puddled field, and five herbicides, namely pyrazosulfuron-ethyl, butachlor, orthosulfamuron, butachlor + propanil and 2,4-D amine, were applied to the rice at their recommended rates and times in eight treatment combinations, compared with one untreated control. The residual effects of those rice herbicides on the succeeding wheat and lentil were examined using a micro-plot bioassay technique. The study revealed that the germination of wheat and lentil seeds was not affected by the residue of herbicides applied to the preceding rainy season rice. The shoot lengths of wheat and lentil seedlings in herbicide-treated plots also did not differ significantly from those in the untreated control plots. Herbicide-treated plots of wheat had leaf chlorophyll contents higher than the control plots by 1.8-14.0% on average, while in lentil the herbicide-treated plots showed a negligible reduction in leaf chlorophyll content compared with the control plots. Grain yields of wheat and lentil in herbicide-treated plots were higher than in control plots by 2.8-6.6% and 0.2-10.9%, respectively. Therefore, the two-year bioassay study indicates that the tested herbicides, applied to rainy season rice in a strip-tilled non-puddled field, had no adverse residual effect on the growth and yield of the succeeding wheat and lentil.

Keywords: crop sensitivity, herbicide persistence, minimum tillage rice, yield improvement

Procedia PDF Downloads 141
8616 Development of Bicomponent Fibre to Combat Insects

Authors: M. Bischoff, F. Schmidt, J. Herrmann, J. Mattheß, G. Seide, T. Gries

Abstract:

Crop yields have not increased as dramatically as the demand for food. One method to counteract this is to use pesticides to keep away predators; for example, several forms of insecticide are available to fight insects. These insecticides and pesticides are controversial, as their application and their residues in food products can also harm humans. In this study, an alternative method to combat insects is investigated, based on the physical insect-killing effect of SiO2 particles. The particles are applied on fibres to avoid the erosion in the fields that would occur if they were applied separately. The development of such SiO2-functionalized PP fibres is shown.

Keywords: agriculture, environment, insects, protection, silica, textile

Procedia PDF Downloads 273
8615 Yawning Computing Using Bayesian Networks

Authors: Serge Tshibangu, Turgay Celik, Zenzo Ncube

Abstract:

Road crashes kill over a million people every year and leave millions more injured or permanently disabled. Various annual reports reveal that the percentage of fatal crashes due to fatigue or the driver falling asleep comes directly after the percentage of fatal crashes due to intoxicated drivers. This percentage is higher than the combined percentage of fatal crashes due to illegal/unsafe U-turns and illegal/unsafe reversing. Although only a relatively small percentage of police reports on road accidents highlight drowsiness and fatigue, the importance of these factors is greater than we might think, hidden by the undercounting of such events. Some scenarios show that these factors are significant in accidents with killed and injured people, hence the need for an automatic driver fatigue detection system to considerably reduce the number of accidents owing to fatigue. This research approaches the driver fatigue detection problem in an innovative way by combining cues collected from both temporal analysis of drivers’ faces and the environment. Monotony in the driving environment is inter-related with visual symptoms of fatigue on drivers’ faces to achieve fatigue detection. Optical and infrared (IR) sensors are used to analyse the monotony of the driving environment and to detect the visual symptoms of fatigue on the human face. Internal cues from drivers’ faces and external cues from the environment are combined using machine learning algorithms to automatically detect fatigue.
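
A toy illustration of this kind of cue fusion, not the authors' model: a naive Bayes combination of one facial cue (yawning) and one environmental cue (road monotony), with made-up conditional probabilities.

```python
# Hypothetical conditional probabilities for a two-cue Bayesian fusion:
# P(cue | fatigue state) for the facial cue (yawning) and the environment
# cue (monotonous road), with a prior P(fatigue).
p_fatigue = 0.10
p_yawn = {True: 0.70, False: 0.15}      # P(yawning | fatigued?)
p_mono = {True: 0.60, False: 0.30}      # P(monotony | fatigued?)

def fatigue_posterior(yawning, monotony):
    # Naive Bayes: cues assumed conditionally independent given fatigue.
    def joint(fatigued):
        ly = p_yawn[fatigued] if yawning else 1 - p_yawn[fatigued]
        lm = p_mono[fatigued] if monotony else 1 - p_mono[fatigued]
        prior = p_fatigue if fatigued else 1 - p_fatigue
        return ly * lm * prior
    jt, jf = joint(True), joint(False)
    return jt / (jt + jf)

print(fatigue_posterior(yawning=True, monotony=True))   # both cues fire
print(fatigue_posterior(yawning=True, monotony=False))  # facial cue only
```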

Keywords: intelligent transportation systems, bayesian networks, yawning computing, machine learning algorithms

Procedia PDF Downloads 437
8614 4D Modelling of Low Visibility Underwater Archaeological Excavations Using Multi-Source Photogrammetry in the Bulgarian Black Sea

Authors: Rodrigo Pacheco-Ruiz, Jonathan Adams, Felix Pedrotti

Abstract:

This paper introduces the applicability of underwater photogrammetric survey within challenging conditions as the main tool to enhance and enrich the process of documenting archaeological excavation through the creation of 4D models. Photogrammetry was attempted on underwater archaeological sites at least as early as the 1970s, and today the production of traditional 3D models is becoming a common practice within the discipline. Underwater photogrammetry is more often implemented to record exposed underwater archaeological remains and less so as a dynamic interpretative tool. It therefore tends to be applied in bright environments when underwater visibility is > 1 m, which limits its implementation on most submerged archaeological sites in more turbid conditions. Recent years have seen significant development of better digital photographic sensors and improved optical technology, ideal for darker environments. Such developments, in tandem with powerful computing systems for processing, have allowed underwater photogrammetry to be used in this research as a standard recording and interpretative tool. Using multi-source photogrammetry (five GoPro Hero5 Black cameras), this paper presents the accumulation of daily (4D) underwater surveys carried out at the Early Bronze Age (3,300 BC) to Late Ottoman (17th century AD) archaeological site of Ropotamo in the Bulgarian Black Sea under challenging conditions (< 0.5 m visibility). It demonstrates that underwater photogrammetry can and should be used as one of the main recording methods, even in low light and poor underwater conditions, as a way to better understand the complexity of the underwater archaeological record.

Keywords: 4D modelling, Black Sea Maritime Archaeology Project, multi-source photogrammetry, low visibility underwater survey

Procedia PDF Downloads 212
8613 FRATSAN: A New Software for Fractal Analysis of Signals

Authors: Hamidreza Namazi

Abstract:

Fractal analysis assesses the fractal characteristics of data. It consists of several methods for assigning fractal characteristics to a dataset, which may be a theoretical dataset or a pattern or signal extracted from phenomena including natural geometric objects, sound, market fluctuations, heart rates, digital images, molecular motion, networks, etc. Fractal analysis is now widely used in all areas of science. An important limitation of fractal analysis is that arriving at an empirically determined fractal dimension does not necessarily prove that a pattern is fractal; rather, other essential characteristics have to be considered. For this purpose, a Visual C++ based software package called FRATSAN (FRActal Time Series ANalyser) was developed, which extracts information from signals through three measures: fractal dimension, Jeffrey’s measure and the Hurst exponent. After computing these measures, the software plots the graphs for each measure. Besides computing the three measures, the software can also classify whether or not a signal is fractal. In fact, the software uses a dynamic method of analysis for all the measures: a sliding window is selected with a length equal to 10% of the total number of data entries, and this window is moved one data entry at a time to obtain all the measures. This makes the computation very sensitive to slight changes in the data, thereby giving the user an acute analysis of the data. In order to test the performance of the software, a set of EEG signals was given as input, and the results were computed and plotted. The software is useful not only for fundamental fractal analysis of signals but can also be used for other purposes. For instance, by analyzing the Hurst exponent plot of a given EEG signal in patients with epilepsy, the onset of a seizure can be predicted by noticing sudden changes in the plot.
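
For a flavor of the sliding-window computation, here is a simplified rescaled-range (R/S) Hurst estimate in Python over a 10% window moved one sample at a time; the signal is synthetic noise standing in for an EEG trace, and the estimator is a textbook simplification rather than FRATSAN's implementation.

```python
import numpy as np

def hurst_rs(x):
    # Simplified rescaled-range (R/S) Hurst estimate: fit log(R/S)
    # against log(n) over a few sub-series sizes.
    log_n, log_rs = [], []
    for n in (8, 16, 32, len(x)):
        rs = []
        for start in range(0, len(x) - n + 1, n):
            w = x[start:start + n]
            z = np.cumsum(w - w.mean())          # mean-adjusted cumulative sum
            r, s = z.max() - z.min(), w.std()
            if s > 0:
                rs.append(r / s)
        if rs:
            log_n.append(np.log(n))
            log_rs.append(np.log(np.mean(rs)))
    return np.polyfit(log_n, log_rs, 1)[0]       # slope ~ Hurst exponent

signal = np.random.default_rng(0).normal(size=3000)   # toy stand-in for EEG
win = int(0.10 * len(signal))                # 10% sliding window, as above
hurst = [hurst_rs(signal[i:i + win]) for i in range(len(signal) - win)]
print(f"windows: {len(hurst)}, mean H = {np.mean(hurst):.2f}")
```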

Keywords: EEG signals, fractal analysis, fractal dimension, hurst exponent, Jeffrey’s measure

Procedia PDF Downloads 438
8612 Change Detection Analysis on Support Vector Machine Classifier of Land Use and Land Cover Changes: Case Study on Yangon

Authors: Khin Mar Yee, Mu Mu Than, Kyi Lint, Aye Aye Oo, Chan Mya Hmway, Khin Zar Chi Winn

Abstract:

The dynamic Land Use and Land Cover (LULC) changes in Yangon have generally resulted in improvements in human welfare and economic development over the last twenty years. Mapping LULC is crucially important for the sustainable development of the environment. However, exact data on how environmental factors influence the LULC situation at various scales are difficult to obtain, because the natural environment is composed of non-homogeneous surface features, so the satellite data also contain mixed pixels. The main objective of this study is the calculation of accuracy based on change detection of LULC changes by Support Vector Machines (SVMs). For this research work, the main data were satellite images from 1996, 2006 and 2015. Change detection statistics were computed to compile a detailed tabulation of changes between two classification images, and the SVM process was applied with a soft approach at the allocation as well as the testing stage to achieve higher accuracy. The results of this paper showed that vegetation and cultivated area decreased (by an average total of 29% from 1996 to 2015) because of conversion to built-up area, which more than doubled (an average total of 30% from 1996 to 2015). The error matrix and confidence limits led to the validation of the results of the LULC mapping.
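
A compact sketch of the workflow, SVM classification of two dates followed by a change detection cross-tabulation, using scikit-learn on synthetic two-band pixels; all class means and scene statistics are invented stand-ins for the Yangon data.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Hypothetical training pixels: two spectral bands per pixel, three classes
# (0 = vegetation, 1 = built-up, 2 = water).
means = {0: [0.2, 0.7], 1: [0.7, 0.3], 2: [0.1, 0.1]}
X = np.vstack([rng.normal(means[c], 0.05, size=(50, 2)) for c in means])
y = np.repeat(list(means), 50)

# probability=True exposes class probabilities for a soft allocation stage.
clf = SVC(kernel="rbf", probability=True).fit(X, y)

# Classify the (hypothetical) scene at two dates and cross-tabulate class
# transitions -- the core of a change detection statistics table.
scene_1996 = rng.normal([0.25, 0.65], 0.08, size=(400, 2))  # mostly vegetation
scene_2015 = rng.normal([0.55, 0.40], 0.08, size=(400, 2))  # shifted to built-up
c96, c15 = clf.predict(scene_1996), clf.predict(scene_2015)

transitions = np.zeros((3, 3), dtype=int)
for a, b in zip(c96, c15):
    transitions[a, b] += 1
print(transitions)  # row = class in 1996, column = class in 2015
```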

Keywords: land use and land cover change, change detection, image processing, support vector machines

Procedia PDF Downloads 103
8611 Structure Clustering for Milestoning Applications of Complex Conformational Transitions

Authors: Amani Tahat, Serdal Kirmizialtin

Abstract:

Trajectory fragment methods such as Markov State Models (MSM), Milestoning (MS) and Transition Path Sampling are the prime choices for extending the timescale of all-atom Molecular Dynamics simulations. In these approaches, a set of structures that covers the accessible phase space has to be chosen a priori using cluster analysis. Structural clustering serves to partition the conformational state into natural subgroups based on their similarity, an essential statistical methodology for analyzing the numerous sets of empirical data produced by Molecular Dynamics (MD) simulations. The local transition kernel among these clusters is later used to connect the metastable states, using a Markovian kinetic model in MSM and a non-Markovian model in MS. The choice of clustering approach in constructing such a kernel is crucial, since the high dimensionality of biomolecular structures can easily confuse the identification of clusters when using the traditional hierarchical clustering methodology. Of particular interest, in the case of MS, where the milestones are very close to each other, accurate determination of the milestone identity of the trajectory becomes a challenging issue. Throughout this work, we present two cluster analysis methods applied to the cis-trans isomerism of the dinucleotide AA. The choice of nucleic acids over the commonly used proteins for studying cluster analysis is twofold: (i) the energy landscape is rugged, hence transitions are more complex, enabling a more realistic model for studying conformational transitions; (ii) the conformational space of nucleic acids is high dimensional, and a diverse set of internal coordinates is necessary to describe their metastable states, posing a challenge in studying conformational transitions. We therefore need improved clustering methods that accurately identify the AA structure in its metastable states in a way that is robust to a wide range of confusing data conditions. The single linkage approach of the hierarchical clustering available in the GROMACS MD package is the first clustering methodology applied to our data. A Self-Organizing Map (SOM) neural network, also known as a Kohonen network, is the second data clustering methodology. The performance of the neural network and of the hierarchical clustering method is compared by computing the mean first passage times for the cis-trans conformational rates. Our hope is that this study provides insight into the complexities of, and the need for, determining the appropriate clustering algorithm for kinetic analysis. Our results can improve the effectiveness of decisions based on clustering confusing empirical data in studying conformational transitions in biomolecules.
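
The two clustering routes can be contrasted on a toy torsion-angle dataset. The sketch below, a simplification using invented data, runs SciPy's single-linkage hierarchical clustering next to a small Kohonen map from the third-party minisom package (both library choices are our assumptions, not the paper's tooling).

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from minisom import MiniSom

rng = np.random.default_rng(0)

# Toy stand-in for dinucleotide conformations: two metastable basins in a
# 2-D torsion-angle space (degrees), mimicking cis/trans populations.
cis   = rng.normal([0.0, 60.0], 15.0, size=(200, 2))
trans = rng.normal([180.0, 60.0], 15.0, size=(200, 2))
X = np.vstack([cis, trans])
Xn = (X - X.mean(axis=0)) / X.std(axis=0)        # standardize features

# 1) Hierarchical clustering with single linkage (as in GROMACS).
labels_sl = fcluster(linkage(Xn, method="single"), t=2, criterion="maxclust")

# 2) Self-organizing (Kohonen) map: each neuron's winner set is a cluster.
som = MiniSom(1, 2, input_len=2, sigma=0.5, learning_rate=0.5, random_seed=0)
som.train_random(Xn, 2000)
labels_som = np.array([som.winner(x)[1] for x in Xn])

print("single-linkage cluster sizes:", np.bincount(labels_sl)[1:])
print("SOM cluster sizes           :", np.bincount(labels_som))
```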

Keywords: milestoning, self organizing map, single linkage, structure clustering

Procedia PDF Downloads 201
8610 Lightning Protection Design Applied to Sustainable Development

Authors: Sylvain Fauveaux, T. Nowicki

Abstract:

Lightning protection is nowadays applied worldwide, following the advent of international standards, and is widely justified by the casualties and damage involved. As a matter of fact, the lightning protection business is constantly growing, as more and more sensitive areas need to be protected. However, the worldwide demand for copper material is increasing as well, and so is its price. Furthermore, the most frequently used method of protection consumes a lot of copper, and copper production in turn consumes a large amount of natural and energy resources, not to mention its effect on the ecological balance.

Keywords: ESEAT, lightning protection, natural resources management, NF C 17-102, sustainable development

Procedia PDF Downloads 135
8608 Wear and Friction Behavior of Porcelain Coated with Polyurethane/SiO2 Coating Layer

Authors: Ching Yern Chee

Abstract:

Various loadings of nano silica were added to polyurethane (PU) and then coated on porcelain substrates. The wear and friction properties of the porcelain substrates coated with polyurethane/nano silica nanocomposite coatings were investigated using a reciprocating wear testing machine. The friction and wear of the coated porcelain substrates were studied at different sliding speeds and applied loads. It was found that the optimum composition of nano silica is 3 wt%, which gives the lowest friction coefficient and wear rate over all applied load ranges and sliding speeds. For the 3 wt% nano silica filled PU-coated porcelain substrate, increasing the sliding speed caused higher wear rates but a lower friction coefficient, while the friction coefficient decreased and the wear rate increased with increasing applied load.

Keywords: porcelain, nanocomposite coating, morphology, friction, wear behavior

Procedia PDF Downloads 506
8608 Embedded System of Signal Processing on FPGA: Underwater Application Architecture

Authors: Abdelkader Elhanaoui, Mhamed Hadji, Rachid Skouri, Said Agounad

Abstract:

The purpose of this paper is to study the phenomenon of acoustic scattering using a new method. Signal processing (the Fast Fourier Transform (FFT), Inverse Fast Fourier Transform (iFFT) and Bessel functions) is widely applied to obtain information with high precision and accuracy, and is usually implemented on general-purpose processors. Our interest was focused on the use of FPGAs (Field-Programmable Gate Arrays) in order to minimize the computational complexity of the single-processor architecture, accelerate the processing on the FPGA, and meet real-time and energy efficiency requirements, since general-purpose processors are not efficient for signal processing. We implemented the acoustic backscattered signal processing model on the Altera DE-SoC board and compared it to the Odroid XU4. By comparison, the computing latencies of the Odroid XU4 and the FPGA are 60 seconds and 3 seconds, respectively. The detailed SoC FPGA-based system computes acoustic spectra up to 20 times faster than the Odroid XU4 implementation, and the FPGA-based implementation of the processing algorithms achieves an absolute error of about 10⁻³. This study underlines the increasing importance of embedded systems in underwater acoustics, especially in non-destructive testing, where it is possible to obtain information related to the detection and characterization of submerged shells. We have thus achieved good experimental results in terms of real-time operation and energy efficiency.
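
A host-side sketch of the spectral stage of such a pipeline, ours, not the authors' FPGA design: an FFT of a synthetic backscattered echo followed by extraction of its resonance peaks. The sampling rate, echo model and thresholds are all hypothetical.

```python
import numpy as np
from scipy.signal import find_peaks

fs = 100_000                                   # hypothetical sampling rate (Hz)
t = np.arange(0, 0.01, 1 / fs)

# Toy backscattered echo: two decaying resonance components plus noise.
echo = (np.sin(2 * np.pi * 12_000 * t) * np.exp(-300 * t)
        + 0.5 * np.sin(2 * np.pi * 27_000 * t) * np.exp(-500 * t)
        + 0.05 * np.random.default_rng(0).normal(size=t.size))

spectrum = np.abs(np.fft.rfft(echo))           # the FFT stage of the pipeline
freqs = np.fft.rfftfreq(t.size, 1 / fs)
peaks, _ = find_peaks(spectrum, height=0.2 * spectrum.max(), distance=20)
print("resonance estimates (Hz):", freqs[peaks])
```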

Keywords: DE1 FPGA, acoustic scattering, form function, signal processing, non-destructive testing

Procedia PDF Downloads 55
8607 Applications of Nanoparticles via Laser Ablation in Liquids: A Review

Authors: Fawaz M. Abdullah, Abdulrahman M. Al-Ahmari, Madiha Rafaqat

Abstract:

Laser ablation of a solid target in a liquid can be used to fabricate nanoparticles (NPs) of metals or of different compositions of materials such as alloys, oxides, carbides and hydroxides. The fabrication of NPs in liquids by laser ablation has grown rapidly in recent decades compared to other techniques. Nowadays, laser ablation has been improved to prepare different types of NPs with particular morphologies, microstructures, phases and sizes, which can be applied in various fields. This paper reviews and highlights the different sizes, shapes and application fields of nanoparticles produced by laser ablation in different liquids and from different materials. The paper also provides a case study on titanium NPs produced by laser ablation submerged in distilled water. The size of NPs is an important parameter, especially for their usage and applications. The size and shape were analyzed by SEM, energy-dispersive X-ray analysis (EDAX) was applied to evaluate the oxidation and elemental composition of the titanium NPs, and XRD was used to evaluate the phase composition and the peaks of both titanium and some elements. The SEM technique showed that the synthesized NP sizes ranged between 15 and 35 nm, suitable for application in various fields, such as the annihilation of cancerous cells.

Keywords: nanoparticles, laser ablation, titanium NPs, applications

Procedia PDF Downloads 119
8606 Spectral Analysis Applied to Variables of Oil Wells Profiling

Authors: Suzana Leitão Russo, Mayara Laysa de Oliveira Silva, José Augusto Andrade Filho, Vitor Hugo Simon

Abstract:

Currently, seismic and prospecting methods are commonly applied in the oil industry, and, as reported every day, oil is a non-renewable energy source. It is therefore easy to understand why ownership of oil extraction areas is coveted by many nations, and why it is necessary to find ways of maximizing oil production. The technique of spectral analysis can be used to analyze the behavior of the variables already defined in the oil well profile. The main objective is to verify the serial dependence of the variables and to model them in the frequency domain, observing the model residuals.
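
A minimal frequency-domain pass over one such profiled variable, with invented synthetic data in place of real well logs:

```python
import numpy as np
from scipy.signal import periodogram

rng = np.random.default_rng(1)

# Hypothetical well-log variable sampled every 0.5 m of depth: a slow cyclic
# trend (period ~ 20 m) buried in noise, standing in for a profiled variable.
depth_step = 0.5
z = np.arange(0, 300, depth_step)
gamma = 50 + 8 * np.sin(2 * np.pi * z / 20.0) + rng.normal(0, 3, z.size)

freqs, power = periodogram(gamma, fs=1.0 / depth_step)   # cycles per metre
dominant = freqs[np.argmax(power[1:]) + 1]               # skip the DC bin
print(f"dominant spatial period ~ {1.0 / dominant:.1f} m")
```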

Keywords: oil, well, spectral analysis, oil extraction

Procedia PDF Downloads 508
8605 Application of the Fourier Transform for Dynamic Control of Structures with Global Positioning System

Authors: J. M. de Luis Ruiz, P. M. Sierra García, R. P. García, R. P. Álvarez, F. P. García, E. C. López

Abstract:

Given the evolution of viaducts, structural health monitoring requires increasingly complex techniques to define their state. Two alternatives can be distinguished: experimental and operational modal analysis. Although accelerometers and the Global Positioning System (GPS) have been applied to the monitoring of structures in service, dynamic monitoring during the construction stage is not common. This research analyzes whether GPS data can be applied to certain dynamic geometric controls of evolving structures. The fundamentals of this work were applied to the New Bridge of Cádiz (Spain), a worldwide milestone in bridge building. GPS data were recorded at an interval of 1 second during the erection of segments and transformed to the frequency domain with the Fourier transform. The vibration period and amplitude were contrasted with those provided by the finite element model, with differences of less than 10%, which is admissible. This process provides a vibration record of the structure with GPS, avoiding the need for specialized equipment.
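
The core computation, recovering a vibration period and amplitude from 1 Hz GPS displacements via the Fourier transform, can be sketched as follows; the displacement series is simulated, with a hypothetical 0.25 Hz mode and GPS-like noise.

```python
import numpy as np

fs = 1.0                                    # GPS logged at 1-second intervals
t = np.arange(0, 600, 1 / fs)

# Hypothetical vertical displacement record of a segment (metres): a 0.25 Hz
# structural vibration plus GPS measurement noise.
disp = (0.012 * np.sin(2 * np.pi * 0.25 * t)
        + np.random.default_rng(0).normal(0, 0.004, t.size))

spectrum = np.fft.rfft(disp - disp.mean())
freqs = np.fft.rfftfreq(t.size, 1 / fs)
k = np.argmax(np.abs(spectrum))                  # dominant spectral line
amplitude = 2 * np.abs(spectrum[k]) / t.size     # single-sided amplitude
print(f"vibration period ~ {1 / freqs[k]:.1f} s, "
      f"amplitude ~ {amplitude * 1000:.1f} mm")
# These values would then be contrasted with the finite element model's
# prediction, accepting differences below 10%.
```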

Keywords: Fourier transform, global position system, operational modal analysis, structural health monitoring

Procedia PDF Downloads 219
8604 BeamGA Median: A Hybrid Heuristic Search Approach

Authors: Ghada Badr, Manar Hosny, Nuha Bintayyash, Eman Albilali, Souad Larabi Marie-Sainte

Abstract:

The median problem is widely applied to derive the most reasonable rearrangement phylogenetic tree for many species. More specifically, the problem is concerned with finding a permutation that minimizes the sum of distances between itself and a set of three signed permutations. Genomes with an equal number of genes but a different order can be represented as permutations. In this paper, an algorithm named BeamGA median is proposed, which combines a heuristic search approach (local beam search) as an initialization step to generate a number of solutions, after which a Genetic Algorithm (GA) is applied to refine the solutions, aiming to achieve a better median with the smallest possible reversal distance from the three original permutations. In this approach, any genome rearrangement distance can be applied; in this paper, we use the reversal distance. To the best of our knowledge, the proposed approach has not been applied before to solving the median problem. Our approach considers a true biological evolution scenario by applying the concept of common intervals during the GA optimization process. This allows us to imitate true biological behavior and enhance the time convergence of the genetic approach. We were able to handle permutations with a large number of genes within an acceptable time and with the same or better accuracy as compared to existing algorithms.
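
A stripped-down skeleton of the beam-then-evolve idea, our illustration, not the BeamGA implementation: it works on unsigned permutations, replaces the reversal distance with a simple pairwise-order (inversion) distance for brevity, and refines a small beam with reversal mutations only (no crossover or common-interval machinery).

```python
import random

def inversions(p, q):
    # Stand-in distance: number of pairwise order disagreements between two
    # permutations (the paper itself uses the reversal distance).
    pos = {g: i for i, g in enumerate(q)}
    r = [pos[g] for g in p]
    return sum(1 for i in range(len(r))
                 for j in range(i + 1, len(r)) if r[i] > r[j])

def median_score(p, genomes):
    return sum(inversions(p, g) for g in genomes)

def reverse_mutation(p):
    i, j = sorted(random.sample(range(len(p)), 2))
    return p[:i] + p[i:j + 1][::-1] + p[j + 1:]      # one reversal move

random.seed(0)
genomes = [[0,1,2,3,4,5,6,7], [1,0,2,3,5,4,6,7], [0,1,3,2,4,5,7,6]]

# Beam initialization: keep the best few one-reversal neighbours of the inputs.
beam = sorted((reverse_mutation(g) for g in genomes * 5),
              key=lambda p: median_score(p, genomes))[:6]

# GA-style refinement: mutate survivors, keep the best, iterate.
for _ in range(300):
    children = [reverse_mutation(random.choice(beam)) for _ in range(12)]
    beam = sorted(beam + children, key=lambda p: median_score(p, genomes))[:6]

print("median candidate:", beam[0], "score:", median_score(beam[0], genomes))
```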

Keywords: median problem, phylogenetic tree, permutation, genetic algorithm, beam search, genome rearrangement distance

Procedia PDF Downloads 246
8603 Forced Degradation Study of Rifaximin Formulated Tablets to Determine Stability Indicating Nature of High-Performance Liquid Chromatography Analytical Method

Authors: Abid Fida Masih

Abstract:

A forced degradation study of Rifaximin was conducted to determine the stability-indicating potential of an HPLC testing method for the detection of Rifaximin in formulated tablets, to be employed for quality control and stability testing. The method in question was applied with a methanol:water (70:30) mobile phase, a 5 µm, 250 x 4.6 mm C18 column, a wavelength of 293 nm and a flow rate of 1.0 ml/min. The forced degradation study was performed under oxidative, acidic, basic, thermal and photolytic conditions. The applied method successfully determined the degradation products after acidic and basic degradation without interference with the detection of Rifaximin. The method is therefore stability-indicating and can be applied for quality control and stability testing of Rifaximin tablets during their shelf life.

Keywords: forced degradation, high-performance liquid chromatography, method validation, rifaximin, stability indicating method

Procedia PDF Downloads 270
8602 Study of the Effect of Twin Tunnels' Position on Each Other's Seismic Behavior

Authors: M. Azadi, M. Kalhor

Abstract:

The excavation of shallow tunnels, such as subways in urban areas, plays a significant role as a lifeline, and the investigation of soil behavior in response to tunnel construction is one of the vital subjects studied in geotechnics. Nowadays, urban tunnels are mostly drilled by TBMs, and the change in the forces applied to the tunnel lining is one of the riskiest matters when drilling tunnels with these machines. Variation in soil cementation can change the behavior of these forces in the tunnel lining. Therefore, this article assesses the impact of tunnel excavation in different soils, with several degrees of cementation, on the loads applied to the tunnel lining under static and dynamic loads. According to the obtained results, changing the cementation of the soil significantly affects the loads applied to the tunnel envelope. The axial force in the tunnel lining decreases considerably as soil cementation increases. The bending moment and shear force in the tunnel lining also decrease as soil cementation increases, improving the bending and shear behavior of the segments. Based on the dynamic analyses, as the cohesion of the soil increases, the bending moment, axial force and shear force of the segments decrease, while the lining behavior is the same as in the static state. The results show that the reduction in the overburden load applied to the lining caused by cementation differs between the static and dynamic states.

Keywords: seismic behavior, twin tunnels, tunnel positions, TBM, optimum distance

Procedia PDF Downloads 269
8601 Mecano-Reliability Approach Applied to a Water Storage Tank Placed on Ground

Authors: Amar Aliche, Hocine Hammoum, Karima Bouzelha, Arezki Ben Abderrahmane

Abstract:

Traditionally, the dimensioning of storage tanks is conducted with a deterministic approach based on partial safety coefficients. These coefficients are applied to take into account the uncertainties related to the properties of the materials used and to the applied loads. However, the use of these safety factors in the design process does not assure an optimal and reliable solution and can sometimes lead to a lack of robustness in the structure. Reliability theory, based on a probabilistic formulation of construction safety, can respond in a well-adapted manner: it allows the construction of a model in which uncertain data are represented by random variables, and therefore allows a better appreciation of safety margins through confidence indicators. The work presented in this paper consists of a mecano-reliability analysis of a concrete storage tank placed on the ground. The classical Monte Carlo simulation method is used to evaluate the failure probability of the concrete tank, with the seismic acceleration considered as a random variable.
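
The Monte Carlo step can be illustrated in a few lines: draw random seismic accelerations and random tank capacities, count demand-exceeds-capacity events, and report the failure probability. The distributions and their parameters below are hypothetical placeholders, not the paper's values.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(42)
n = 1_000_000                                   # Monte Carlo trials

# Hypothetical random variables: peak seismic acceleration (lognormal) and
# the tank wall's structural capacity (normal), both expressed in g.
acceleration = rng.lognormal(mean=np.log(0.20), sigma=0.5, size=n)
capacity     = rng.normal(loc=0.55, scale=0.08, size=n)

pf = np.mean(acceleration > capacity)           # failure: demand > capacity
print(f"failure probability = {pf:.2e}")
print(f"reliability index beta = {-norm.ppf(pf):.2f}")
```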

Keywords: reliability approach, storage tanks, monte carlo simulation, seismic acceleration

Procedia PDF Downloads 282
8600 Managing Data from One Hundred Thousand Internet of Things Devices Globally for Mining Insights

Authors: Julian Wise

Abstract:

Newcrest Mining is one of the world’s top five gold and rare earth mining organizations by production, reserves and market capitalization. This paper elaborates on the data acquisition processes employed by Newcrest, in collaboration with the Fortune 500 listed organization Insight Enterprises, to standardize machine learning solutions that process data from over a hundred thousand distributed Internet of Things (IoT) devices located at mine sites globally. Through the utilization of cloud software architecture and edge computing, these technological developments enable standardized machine learning processes to inform the strategic optimization of mineral processing. The target objectives of the machine learning optimizations include time savings in mineral processing, production efficiencies, risk identification, and increased production throughput. The data acquired and utilized for predictive modelling are processed through edge computing by resources collectively stored within a data lake. Involvement in this digital transformation has necessitated standardizing the software architecture used to manage the machine learning models submitted by vendors, to ensure effective automation and continuous improvement of the mineral process models. Operating at scale, the system processes hundreds of gigabytes of data per day from distributed mine sites across the globe, for the purposes of improved worker safety and production efficiency through big data applications.

Keywords: mineral technology, big data, machine learning operations, data lake

Procedia PDF Downloads 90
8599 3 Phase Induction Motor Control Using Single Phase Input and GSM

Authors: Pooja S. Billade, Sanjay S. Chopade

Abstract:

This paper focuses on the design of a three-phase induction motor control using a single-phase input and GSM. The controller used in this work provides wireless speed control using a GSM technique that proves to be very efficient and reliable in applications. The most common control principle is the constant V/Hz principle, which requires that the magnitude and frequency of the voltage applied to the stator of the motor maintain a constant ratio. By doing this, the magnitude of the magnetic field in the stator is kept at an approximately constant level throughout the operating range, and thus the maximum constant torque-producing capability is maintained. The energy that a switching power converter delivers to the motor is controlled by Pulse Width Modulated (PWM) signals applied to the gates of the power transistors in an H-bridge configuration. PWM signals are pulse trains with fixed frequency and magnitude and variable pulse width. When a PWM signal is applied to the gate of a power transistor, it causes the turn-on and turn-off intervals of the transistor to change from one PWM period to the next.
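
A small numeric sketch of the constant V/Hz law and the sinusoidal PWM duty-cycle generation it drives; the nameplate ratings, boost voltage and carrier frequency below are hypothetical.

```python
import numpy as np

V_RATED, F_RATED = 400.0, 50.0          # hypothetical motor nameplate values
V_PER_HZ = V_RATED / F_RATED            # constant V/Hz ratio

def vhz_command(f_cmd, v_boost=8.0):
    # Constant V/Hz law with a small low-speed boost; clamp at rated voltage.
    return min(V_RATED, v_boost + V_PER_HZ * f_cmd)

def pwm_duties(f_cmd, f_carrier=2000.0, n=12):
    # Sinusoidal PWM: the duty cycle of each carrier period follows the
    # commanded sine wave, scaled by the V/Hz voltage command.
    m = vhz_command(f_cmd) / V_RATED                 # modulation index
    t = np.arange(n) / f_carrier
    return 0.5 * (1 + m * np.sin(2 * np.pi * f_cmd * t))

for f in (10.0, 25.0, 50.0):
    print(f"{f:4.0f} Hz -> {vhz_command(f):5.1f} V, "
          f"first duty cycles: {np.round(pwm_duties(f)[:4], 2)}")
```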

Keywords: PIC, GSM (global system for mobile communications), LCD (liquid crystal display), IM (induction motor)

Procedia PDF Downloads 424