Search results for: heuristics and biases approach.
807 Numerical Simulation of unsteady MHD Flow and Heat Transfer of a Second Grade Fluid with Viscous Dissipation and Joule Heating using Meshfree Approach
Authors: R. Bhargava, Sonam Singh
Abstract:
In the present study, a numerical analysis is carried out to investigate unsteady MHD (magneto-hydrodynamic) flow and heat transfer of a non-Newtonian second grade viscoelastic fluid over an oscillatory stretching sheet. The flow is induced by an infinite elastic sheet which is stretched oscillatory (back and forth) in its own plane. The effects of viscous dissipation and Joule heating are taken into account. The non-linear differential equations governing the problem are transformed into a system of non-dimensional differential equations using similarity transformations. A newly developed meshfree numerical technique, the element-free Galerkin method (EFGM), is employed to solve the coupled non-linear differential equations. Results illustrating the effect of various parameters, such as the viscoelastic parameter, Hartmann number, relative frequency amplitude of the oscillatory sheet to the stretching rate, and Eckert number, on the velocity and temperature fields are reported in graphs and tables. The present model finds application in polymer extrusion, drawing of plastic films and wires, and glass, fiber and paper production.
Keywords: EFGM, MHD, Oscillatory stretching sheet, Unsteady, Viscoelastic
806 The Role of Velocity Map Quality in Estimation of Intravascular Pressure Distribution
Authors: Ali Pashaee, Parisa Shooshtari, Gholamreza Atae, Nasser Fatouraee
Abstract:
Phase-contrast MR imaging methods are widely used for measurement of blood flow velocity components, and other tools such as CT and ultrasound can also provide velocity maps in intravascular studies. These data are used to derive flow characteristics. Several clinical applications use the pressure distribution in the diagnosis of intravascular disorders such as vascular stenosis. In this paper, an approach to measuring the intravascular pressure field from the velocity field obtained from flow images is proposed. The method uses an algorithm that solves the nonlinear Navier-Stokes equations, assuming blood to be an incompressible Newtonian fluid. Flow images usually suffer from a lack of spatial resolution, and our attempt is to consider the effect of spatial resolution on the pressure distribution estimated by this method. To achieve this aim, the velocity map of a numerical phantom is derived at six different spatial resolutions. To determine the effects of vascular stenoses on the pressure distribution, a stenotic phantom geometry is considered. A comparison between the pressure distribution obtained from the phantom and the pressure resulting from the algorithm is presented. In this regard, we also compared the effects of collocated and staggered computational grids on the pressure distribution obtained with this algorithm.
Keywords: Flow imaging, pressure distribution estimation, phantom, resolution.
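The following is a minimal sketch of how a pressure gradient can be recovered from a measured 2-D velocity map through the incompressible Navier-Stokes momentum balance; the steady-flow simplification, the grid spacing and the blood density/viscosity values are illustrative assumptions, not the paper's full algorithm:

```python
import numpy as np

def pressure_gradient(u, v, dx, dy, rho=1060.0, mu=3.5e-3):
    """Estimate the pressure gradient of a steady, incompressible 2-D flow
    from a velocity map via the Navier-Stokes momentum balance.
    u, v: 2-D velocity components (axis 0 = y, axis 1 = x); dx, dy: pixel spacing."""
    du_dy, du_dx = np.gradient(u, dy, dx)
    dv_dy, dv_dx = np.gradient(v, dy, dx)
    # Laplacians from second differences of the first derivatives
    lap_u = np.gradient(du_dx, dx, axis=1) + np.gradient(du_dy, dy, axis=0)
    lap_v = np.gradient(dv_dx, dx, axis=1) + np.gradient(dv_dy, dy, axis=0)
    # momentum balance: grad(p) = mu * laplacian(velocity) - rho * (convective term)
    dp_dx = mu * lap_u - rho * (u * du_dx + v * du_dy)
    dp_dy = mu * lap_v - rho * (u * dv_dx + v * dv_dy)
    return dp_dx, dp_dy

# toy velocity map: a parabolic (Poiseuille-like) profile across a 2-D channel
y = np.linspace(0, 1, 64)
u = np.tile(4 * y * (1 - y), (128, 1)).T          # shape (64 rows in y, 128 columns in x)
v = np.zeros_like(u)
dp_dx, dp_dy = pressure_gradient(u, v, dx=1e-3, dy=1e-3)
```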
805 A Case Study on Barriers in Total Productive Maintenance Implementation in the Abu Dhabi Power Industry
Authors: A. Alseiari, P. Farrell
Abstract:
Maintenance has evolved into an imperative function and contributes significantly to efficient and effective equipment performance. Total Productive Maintenance (TPM) is an ideal approach to support the development and implementation of operational performance improvement. It systematically aims to understand the function of equipment, the service quality relationship with equipment, and the probable critical equipment failure conditions. Implementation of TPM programmes needs strategic planning, and there has been little research applied in this area within Middle East power plants. In the power sector of Abu Dhabi, the power industry is technologically and strategically important, and it thus needs effective and efficient equipment management support. The aim of this paper is to investigate barriers to successful TPM implementation in the Abu Dhabi power industry. The study has been conducted in the context of a leading power company in the UAE. Semi-structured interviews were conducted with 16 employees, including maintenance and operation staff and senior managers. The findings of this research identified seven key barriers, namely: managerial; organisational; cultural; financial; educational; communications; and auditing. With respect to the understanding of these barriers and obstacles to TPM implementation, the findings can contribute towards improved equipment operations and maintenance in power organisations.
Keywords: Abu Dhabi power industry, TPM implementation, key barriers, organisational culture, critical success factors.
804 Gaits Stability Analysis for a Pneumatic Quadruped Robot Using Reinforcement Learning
Authors: Soofiyan Atar, Adil Shaikh, Sahil Rajpurkar, Pragnesh Bhalala, Aniket Desai, Irfan Siddavatam
Abstract:
Deep reinforcement learning (deep RL) algorithms can replace hand-engineered complex controllers by automatically learning a mapping from sensory inputs to low-level actions. Deep RL thus handles complex robot dynamics with minimal engineering, but it involves high risk when applied directly in real-world scenarios and is highly sensitive to hyperparameters. Tuning hyperparameters on a pneumatic quadruped robot through trial-and-error learning becomes very expensive. This paper presents an automated learning controller for a pneumatic quadruped robot using sample-efficient deep Q-learning, enabling the neural network to be learned with minimal tuning and very few trials. Long training hours may degrade the pneumatic cylinders due to jerky actions originating from stochastic weights. We applied this method to the pneumatic quadruped robot, which resulted in a hopping gait. In our process, we eliminated the use of a simulator and acquired a stable gait. The approach evolves so that the resulting gait becomes more robust to stochastic changes in the environment. We further show that our algorithm performed very well compared with a gait programmed from the robot dynamics.
Keywords: model-based reinforcement learning, gait stability, supervised learning, pneumatic quadruped
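As a hedged illustration of the learning rule involved, here is a minimal tabular Q-learning sketch of the temporal-difference update that deep Q-learning approximates with a neural network; the state/action discretisation, learning rate and reward below are illustrative assumptions, not the robot's actual setup:

```python
import numpy as np

n_states, n_actions = 16, 4          # hypothetical discretisation of gait states and valve actions
Q = np.zeros((n_states, n_actions))
alpha, gamma, epsilon = 0.1, 0.95, 0.1

def choose_action(state, rng):
    # epsilon-greedy exploration over the current value estimates
    if rng.random() < epsilon:
        return int(rng.integers(n_actions))
    return int(np.argmax(Q[state]))

def q_update(state, action, reward, next_state):
    # Q(s,a) <- Q(s,a) + alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))
    td_target = reward + gamma * np.max(Q[next_state])
    Q[state, action] += alpha * (td_target - Q[state, action])

rng = np.random.default_rng(0)
state = 3
action = choose_action(state, rng)
q_update(state, action, reward=1.0, next_state=5)   # reward e.g. forward hopping distance
```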
803 Suitable Partner Node Selection and Resource Allocation in Cooperative Wireless Communication Using the Trade-Off Game
Authors: Oluseye A. Adeleke, Mohd. F. M. Salleh
Abstract:
The performance of any cooperative communication system depends largely on the selection of a proper partner. Another important factor to consider is the efficient allocation of resources, such as power, by the source node to help it forward information to the destination. In this paper, we look at the concepts of partner selection and resource (power) allocation for a distributed communication network. A type of non-cooperative game referred to as the trade-off game is employed so as to jointly consider the utilities of the source and relay nodes, where the source is the node that requires help with forwarding its information and the partner is the node that is willing to help forward the source node’s information, but at a price. The approach enables the source node to maximize its utility by selecting a partner node based on (i) the proximity of the partner node to the source and destination nodes, and (ii) the price the partner node will charge for the help rendered. Our proposed scheme helps the source locate and select relay nodes at ‘better’ locations and purchase power from them optimally. It also helps the contending relay nodes maximize their own utilities by asking appropriate prices. Our game scheme is seen to converge to a unique equilibrium.
Keywords: Cooperative communication, game theory, node, power allocation, trade-off, utility.
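A sketch of the partner-selection step under hypothetical utility forms (a logarithmic rate benefit for the source minus a linear payment to the relay); the paper's exact utility functions and pricing game are not reproduced here:

```python
import numpy as np

def source_utility(gain, power, price):
    # hypothetical form: cooperative rate benefit minus the payment for purchased relay power
    return np.log2(1.0 + gain * power) - price * power

def best_partner(candidates):
    """candidates: list of (relay_id, effective_channel_gain, asked_price).
    For each relay, the source grid-searches the power purchase that maximises
    its utility, then selects the relay giving the highest utility."""
    powers = np.linspace(0.01, 1.0, 100)
    best = None
    for relay_id, gain, price in candidates:
        utilities = source_utility(gain, powers, price)
        i = int(np.argmax(utilities))
        if best is None or utilities[i] > best[1]:
            best = (relay_id, float(utilities[i]), float(powers[i]))
    return best  # (chosen relay, utility achieved, power purchased)

print(best_partner([("R1", 4.0, 0.8), ("R2", 2.5, 0.3), ("R3", 6.0, 1.5)]))
```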
802 Stability and Kinetic Analysis during Vermicomposting of Sewage Sludge
Authors: Ashish Kumar Nayak, Dhamodharan K., Ajay S. Kalamdhad
Abstract:
The present study is aimed at converting sewage sludge into a stable compost product by vermicomposting sewage sludge mixed with cattle manure and sawdust in five different proportions based on C/N ratios (C/N 15 (R1), 20 (R2), 25 (R3) and 30 (R4); and control (R5)), employing the epigeic earthworm Eisenia fetida. Higher reductions in C/N ratio, CO2 evolution and oxygen uptake rate (OUR) were observed in R4, demonstrating compost stability. In addition, R4 proved to be the best combination for the growth of the earthworms. In order to observe the optimal degradation, the kinetics of organic matter degradation in vermicomposting were quantitatively evaluated. A model was developed by assuming that the composting process is carried out in a homogeneous way and that the kinetics of the decomposition reaction are represented by a Monod-type equation. The results exhibit comparable variations in the kinetic constants Km and K3 under varying parameters during the vermicomposting process. The higher R2 value in R4 indicated a better fit to the Lineweaver-Burk plot. R4 also yielded a higher degradability coefficient (K), revealing an optimal nutrient balance that not only enhanced the affinity of the enzymes towards the substrate but also improved the degradation process. Therefore, R4 proved to be the best feed combination for the vermicomposting process compared to the other reactors.
Keywords: Vermicomposting, Eisenia fetida, Sewage sludge, C/N ratio, Stability, Enzyme kinetics concept.
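A brief numerical sketch of the Lineweaver-Burk linearisation used to extract Monod/Michaelis-Menten-type constants and the associated R²; the substrate and rate values below are illustrative, not the study's measurements:

```python
import numpy as np

S = np.array([2.0, 4.0, 8.0, 16.0, 32.0])     # substrate level, arbitrary units (illustrative)
v = np.array([0.45, 0.75, 1.15, 1.55, 1.80])  # observed degradation rate (illustrative)

inv_S, inv_v = 1.0 / S, 1.0 / v
slope, intercept = np.polyfit(inv_S, inv_v, 1)      # 1/v = (Km/Vmax)*(1/S) + 1/Vmax
Vmax = 1.0 / intercept
Km = slope * Vmax
r_squared = np.corrcoef(inv_S, inv_v)[0, 1] ** 2    # goodness of fit of the double-reciprocal plot

print(f"Vmax ≈ {Vmax:.2f}, Km ≈ {Km:.2f}, R² ≈ {r_squared:.3f}")
```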
801 User-Driven Product Line Engineering for Assembling Large Families of Software
Authors: Zhaopeng Xuan, Yuan Bian, C. Cailleaux, Jing Qin, S. Traore
Abstract:
Traditional software engineering allows engineers to propose to their clients multiple specialized software distributions assembled from a shared set of software assets. The management of these assets, however, requires a trade-off between client satisfaction and the software engineering process. Clients find it increasingly difficult to locate a distribution or components matching their needs across all of the distributed repositories.
This paper proposes a software engineering approach for a user-driven software product line in which engineers define a feature model but users drive the actual software distribution on demand. This approach makes the user the final actor, acting as a release manager in the software engineering process, which increases user product satisfaction and simplifies the operations needed to find required components. In addition, it provides a way for engineers to manage and assemble large software families.
As a proof of concept, a user-driven software product line is implemented for Eclipse, an integrated development environment. An Eclipse feature model is defined and exposed to users on a cloud-based build platform from which clients can download individualized Eclipse distributions.
Keywords: Software Product Line, Model-driven Development, Reverse Engineering and Refactoring, Agile Method
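A minimal, hypothetical feature-model sketch of the assembly check behind such a user-driven product line; the feature names and constraints are invented for illustration and do not represent the MarLEM/Eclipse model:

```python
# Engineers declare features and constraints; users pick a selection; the check
# below decides whether the requested distribution can be assembled.
FEATURES = {"core", "java_tools", "git", "debugger", "cloud_build"}
MANDATORY = {"core"}
REQUIRES = {"debugger": {"java_tools"}, "cloud_build": {"git"}}
EXCLUDES = {}  # e.g. {"feature_a": {"feature_b"}}

def valid_selection(selection):
    selection = set(selection) | MANDATORY
    for feature in selection:
        if feature not in FEATURES:
            return False, f"unknown feature: {feature}"
        for needed in REQUIRES.get(feature, set()):
            if needed not in selection:
                return False, f"{feature} requires {needed}"
        for banned in EXCLUDES.get(feature, set()):
            if banned in selection:
                return False, f"{feature} excludes {banned}"
    return True, "selection can be assembled"

print(valid_selection({"debugger", "java_tools"}))   # valid
print(valid_selection({"cloud_build"}))              # invalid: missing "git"
```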
800 Selection of Designs in Ordinal Regression Models under Linear Predictor Misspecification
Authors: Ishapathik Das
Abstract:
The purpose of this article is to find a method of comparing designs for ordinal regression models using quantile dispersion graphs in the presence of linear predictor misspecification. The true relationship between the response variable and the corresponding control variables is usually unknown. The experimenter assumes a certain form of the linear predictor of the ordinal regression model, and the assumed form may not always be correct. Thus, the maximum likelihood estimates (MLE) of the unknown parameters of the model may be biased due to misspecification of the linear predictor. In this article, the uncertainty in the linear predictor is represented by an unknown function. An algorithm is provided to estimate the unknown function at the design points where observations are available, and the unknown function is then estimated at all points in the design region using multivariate parametric kriging. The comparison of the designs is based on a scalar-valued function of the mean squared error of prediction (MSEP) matrix, which incorporates both the variance and the bias of the prediction caused by the misspecification in the linear predictor. The designs are compared using the quantile dispersion graphs approach. The graphs also visually depict the robustness of the designs to changes in the parameter values. Numerical examples are presented to illustrate the proposed methodology.
Keywords: Model misspecification, multivariate kriging, multivariate logistic link, ordinal response models, quantile dispersion graphs.
799 Surface Characteristics of Bacillus megaterium and Its Adsorption Behavior onto Dolomite
Authors: Mohsen Farahat, Tsuyoshi Hirajima
Abstract:
Surface characteristics of a Bacillus megaterium strain were investigated: zeta potential, FTIR spectra and contact angles were measured. Surface energy components, including the Lifshitz-van der Waals component, the Hamaker constant, and the acid/base components (Lewis acid/Lewis base), were calculated from the contact angle data. The results showed that the microbial cells were negatively charged over the whole pH range, with higher values in the alkaline region. A hydrophilic nature of the strain was confirmed by the contact angle and the free energy of adhesion between microbial cells. The adsorption affinity of the strain toward dolomite was studied at different pH values. The results showed that the cells had a high affinity to dolomite at acidic pH compared with neutral and alkaline pH. Extended DLVO theory was applied to calculate the interaction energy between B. megaterium cells and dolomite particles, and the adsorption results were in agreement with the extended DLVO approach. Surface changes occurring on the dolomite surface after the bio-treatment were monitored: the contact angle decreased from 69° to 38° and the mineral’s floatability decreased from 95% to 25% after the treatment.
Keywords: Bacillus megaterium, surface modification, flotation, dolomite, adhesion energy.
798 Adaptive Block State Update Method for Separating Background
Authors: Youngsuck Ji, Youngjoon Han, Hernsoo Hahn
Abstract:
In this paper, we propose a robust moving-object detection method for night street images affected by artificial lighting, based on a block-wise updated reference background model using block-state analysis. The experimental images are acquired as a colour video sequence from a stationary camera. When artificial illumination such as street lights or sign lights suddenly appears, the reference background model is updated with this information. Natural illumination generally changes gradually over time, whereas artificial illumination appears suddenly. Therefore, to detect artificial illumination precisely, a two-stage process is used. The first stage compares the current image with the reference background block by block, identifying the changed blocks. The second stage compares the edge map of the current image with the edge map of the reference background image, making it possible to estimate the illumination in each block. This information makes it possible to detect objects and artificial illumination precisely and to generate a clearer reference background. Blocks are classified by block-state analysis into four states: transient, stationary, background and artificial illumination. Fig. 1 shows the characteristics of each block state [1]. Experimental results show that the presented approach works well in the presence of illumination variance.
Keywords: Block-state, Edge component, Reference background, Artificial illumination.
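A toy sketch of the two-stage, block-wise comparison described above (intensity difference plus edge-map difference); the block size, thresholds and state labels are illustrative assumptions rather than the paper's calibrated values:

```python
import numpy as np

def block_states(current, reference, block=16, diff_thresh=20.0, edge_thresh=8.0):
    """Label each image block by comparing the current grey-level frame with the
    reference background (stage 1) and comparing gradient-magnitude edge maps (stage 2)."""
    def edge_map(img):
        gy, gx = np.gradient(img.astype(float))
        return np.hypot(gx, gy)

    cur_e, ref_e = edge_map(current), edge_map(reference)
    h, w = current.shape
    labels = {}
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            sl = (slice(y, y + block), slice(x, x + block))
            intensity_diff = np.mean(np.abs(current[sl].astype(float) - reference[sl]))
            edge_diff = np.mean(np.abs(cur_e[sl] - ref_e[sl]))
            if intensity_diff < diff_thresh:
                labels[(y, x)] = "background"
            elif edge_diff < edge_thresh:
                # brightness changed but structure did not: sudden artificial illumination
                labels[(y, x)] = "artificial_illumination"
            else:
                labels[(y, x)] = "transient"   # candidate moving object
    return labels
```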
797 Optimization of Process Parameters of Pressure Die Casting using Taguchi Methodology
Authors: Satish Kumar, Arun Kumar Gupta, Pankaj Chandna
Abstract:
The present work analyses different parameters of pressure die casting to minimize casting defects. Pressure die casting is usually applied for casting aluminium alloys. Good surface finish with the required tolerances and dimensional accuracy can be achieved by optimizing controllable process parameters such as solidification time, molten metal temperature, filling time, injection pressure and plunger velocity. Moreover, by selecting optimum process parameters, pressure die casting defects such as porosity, insufficient spread of molten material, and flash are also minimized. Therefore, a pressure die cast component, a carburetor housing of aluminium alloy (Al2Si2O5), has been considered. The effects of the selected process parameters on casting defects, and the subsequent setting of the parameters at suitable levels, have been accomplished using Taguchi's parameter design approach. The experiments have been performed according to the combinations of levels of the process parameters suggested by an L18 orthogonal array. Analyses of variance have been performed for the mean and the signal-to-noise ratio to estimate the percentage contribution of the different process parameters. A confidence interval has also been estimated for a 95% confidence level, and three confirmation experiments have been performed to validate the optimum levels of the parameters. Overall, a 2.352% reduction in defects has been observed with the suggested optimum process parameters.
Keywords: Aluminium Casting, Pressure Die Casting, Taguchi Methodology, Design of Experiments
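A short sketch of the signal-to-noise (S/N) ratios used in Taguchi parameter design; casting defects call for the smaller-the-better form, and the replicate values below are illustrative rather than the study's data:

```python
import numpy as np

def sn_smaller_the_better(y):
    # S/N = -10 * log10( mean(y^2) ), appropriate when defects should be minimised
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(y ** 2))

def sn_larger_the_better(y):
    # S/N = -10 * log10( mean(1/y^2) ), appropriate for responses to be maximised
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y ** 2))

defects = [3.1, 2.8, 3.4]   # % defects in three replicates of one L18 trial (illustrative)
print(f"S/N (smaller-the-better) = {sn_smaller_the_better(defects):.2f} dB")
```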
796 SIFT Accordion: A Space-Time Descriptor Applied to Human Action Recognition
Authors: Olfa.Ben Ahmed, Mahmoud. Mejdoub, Chokri. Ben Amar
Abstract:
Recognizing human actions from videos is an active field of research in computer vision and pattern recognition. Human activity recognition has many potential applications such as video surveillance, human-machine interaction, sports video retrieval and robot navigation. Currently, local descriptors and bag-of-visual-words models achieve state-of-the-art performance for human action recognition. The main challenge in feature description is how to represent the local motion information efficiently. Most previous works focus on extending 2D local descriptors to 3D ones to describe the local information around every interest point. In this paper, we propose a new spatio-temporal descriptor based on a space-time description of moving points. Our description is based on an Accordion representation of the video, which is well suited to recognizing human actions from 2D local descriptors without the need for 3D extensions. We use the bag-of-words approach to represent videos. We quantize 2D local descriptors describing both temporal and spatial features with a good compromise between computational complexity and action recognition rates. We have reached impressive results on publicly available action data sets.
Keywords: Accordion, Bag of Features, Human action, Motion, Moving point, Space-Time Descriptor, SIFT, Video.
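A compact bag-of-visual-words sketch of the quantisation step: a visual vocabulary is learned with k-means and each video is summarised as a normalised word histogram. Random vectors stand in for the 2D SIFT descriptors from the Accordion representation, and the vocabulary size is an assumption:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
training_descriptors = rng.normal(size=(5000, 128))   # descriptors pooled from training videos
video_descriptors = rng.normal(size=(400, 128))       # descriptors of one test video

k = 100                                               # vocabulary size (assumed)
vocabulary = KMeans(n_clusters=k, n_init=4, random_state=0).fit(training_descriptors)

words = vocabulary.predict(video_descriptors)         # assign each descriptor to a visual word
histogram = np.bincount(words, minlength=k).astype(float)
histogram /= histogram.sum()                          # the video's bag-of-words signature
```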
795 Dynamic Metadata Schemes in the Neutron and Photon Science Communities: A Case Study of X-Ray Photon Correlation Spectroscopy
Authors: Amir Tosson, Mohammad Reza, Christian Gutt
Abstract:
Metadata is one of the most important aspects of advancing data management practices within all research communities. Definitions and schemes of metadata are, inter alia, of particular significance in the domain of neutron and photon scattering experiments, which cover a broad range of scientific disciplines. The demand for describing continuously evolving, highly non-standardized experiments, including the resulting processed and published data, constitutes a considerable challenge for a static definition of metadata. Here, we present the concept of dynamic metadata for the neutron and photon scientific community, which enriches a static set of defined basic metadata. We explore the idea of dynamic metadata with the help of the use case of X-ray Photon Correlation Spectroscopy (XPCS), a synchrotron-based scattering technique that allows the investigation of nanoscale dynamic processes. It serves here as a demonstrator of how dynamic metadata can improve data acquisition, sharing, and analysis workflows. Our approach enables researchers to tailor metadata definitions dynamically and adapt them to the evolving demands of describing data and results from a diverse set of experiments. We demonstrate that dynamic metadata standards yield advantages that enhance data reproducibility, interoperability, and the dissemination of knowledge.
Keywords: Big data, metadata, schemas, XPCS, X-ray Photon Correlation Spectroscopy.
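An illustrative sketch of the static-plus-dynamic split for an XPCS record; all field names and values are hypothetical and do not reproduce any official schema:

```python
import json

record = {
    "static": {                       # community-agreed core fields
        "facility": "synchrotron-X",
        "technique": "XPCS",
        "sample": "colloidal suspension",
        "detector": "area-detector-1",
        "timestamp": "2024-05-01T12:00:00Z",
    },
    "dynamic": {                      # experiment-specific fields added on demand
        "q_range_nm^-1": [0.01, 0.1],
        "lag_times_s": [0.001, 0.01, 0.1, 1.0],
        "correlation_model": "stretched-exponential fit of g2",
        "processing_pipeline": ["dark correction", "masking", "multi-tau g2"],
    },
}

print(json.dumps(record, indent=2))
```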
794 An Approach to Polynomial Curve Comparison in Geometric Object Database
Authors: Chanon Aphirukmatakun, Natasha Dejdumrong
Abstract:
In image processing and visualization, comparing two bitmapped images requires matching them pixel by pixel, which takes a lot of computational time, while the comparison of two vector-based images is significantly faster. These raster graphics images can sometimes be approximately converted into vector-based images by various techniques. After conversion, the problem of comparing two raster graphics images can be reduced to the problem of comparing vector graphics images; hence, pixel-by-pixel comparison can be reduced to polynomial comparison. In computer aided geometric design (CAGD), vector graphics images are compositions of curves and surfaces, and curves are defined by a sequence of control points and their polynomials. In this paper, the control points are used to compare curves. Curves that have been translated or rotated are treated as equivalent, while curves that differ only by scaling are considered similar. This paper proposes an algorithm for comparing polynomial curves for equivalence and similarity by using their control points. In addition, the geometric object-oriented database used to keep the curve information has been defined in XML format for further use in curve comparisons.
Keywords: Bezier curve, Said-Ball curve, Wang-Ball curve, DP curve, CAGD, comparison, geometric object database.
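A small sketch of one way to test such equivalence and similarity, assuming corresponding control points are listed in the same order: pairwise control-point distances are invariant under translation and rotation (and reflection), and are proportional under uniform scaling. This is an illustrative check, not the paper's algorithm:

```python
import numpy as np

def _pairwise_distances(points):
    pts = np.asarray(points, dtype=float)
    return np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)

def curves_equivalent(ctrl_a, ctrl_b, tol=1e-9):
    # translated/rotated copies leave all pairwise control-point distances unchanged
    da, db = _pairwise_distances(ctrl_a), _pairwise_distances(ctrl_b)
    return da.shape == db.shape and np.allclose(da, db, atol=tol)

def curves_similar(ctrl_a, ctrl_b, tol=1e-9):
    # uniformly scaled copies have proportional distance matrices
    da, db = _pairwise_distances(ctrl_a), _pairwise_distances(ctrl_b)
    if da.shape != db.shape or da.max() == 0 or db.max() == 0:
        return False
    return np.allclose(da / da.max(), db / db.max(), atol=tol)

square = [(0, 0), (1, 0), (1, 1), (0, 1)]
rotated = [(0, 0), (0, 1), (-1, 1), (-1, 0)]   # the same square rotated by 90 degrees
scaled = [(0, 0), (2, 0), (2, 2), (0, 2)]      # the same square scaled by 2
print(curves_equivalent(square, rotated), curves_similar(square, scaled))
```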
793 Power System Stability Improvement by Simultaneous Tuning of PSS and SVC Based Damping Controllers Employing Differential Evolution Algorithm
Authors: Sangram Keshori Mohapatra, Sidhartha Panda, Prasant Kumar Satpathy
Abstract:
Power-system stability improvement by simultaneous tuning of power system stabilizer (PSS) and a Static Var Compensator (SVC) based damping controller is thoroughly investigated in this paper. Both local and remote signals with associated time delays are considered in the present study. The design problem of the proposed controller is formulated as an optimization problem, and differential evolution (DE) algorithm is employed to search for the optimal controller parameters. The performances of the proposed controllers are evaluated under different disturbances for both single-machine infinite bus power system and multi-machine power system. The performance of the proposed controllers with variations in the signal transmission delays has also been investigated. The proposed stabilizers are tested on a weakly connected power system subjected to different disturbances. Nonlinear simulation results are presented to show the effectiveness and robustness of the proposed control schemes over a wide range of loading conditions and disturbances. Further, the proposed design approach is found to be robust and improves stability effectively even under small disturbance conditions.
Keywords: Differential Evolution Algorithm, Power System Stability, Power System Stabilizer, Static Var Compensator
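A minimal sketch of the differential evolution search itself using SciPy; a standard Rastrigin test function stands in for the paper's damping objective, which is evaluated through nonlinear simulation of the PSS/SVC controller parameters:

```python
import numpy as np
from scipy.optimize import differential_evolution

def objective(x):
    # Rastrigin test function as a placeholder for the simulation-based damping criterion
    x = np.asarray(x)
    return 10 * len(x) + np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x))

bounds = [(-5.12, 5.12)] * 4        # e.g. four controller gains/time constants (illustrative)
result = differential_evolution(objective, bounds, maxiter=200, popsize=20,
                                mutation=(0.5, 1.0), recombination=0.7, seed=1)
print(result.x, result.fun)
```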
792 Master in Maritime Logistics: An Industry-Driven Design
Authors: Marco Sernaglia, Augusto M. P. Carreira, Helena M. L. Carvalho, Pedro B. Água, Armindo Frias, Manuel Carrasqueira
Abstract:
The existence of mismatches between the qualification requirements of professionals in the maritime industry and existing higher education offerings was verified within the scope of the European project MarLEM (Maritime Logistics Engineering and Management). Professionals in the maritime industry today and in the future face additional obstacles as a result of the sector’s global nature as well as its rapid technological and social evolution. As a result, they feel the need to update their skills and knowledge. A professionally oriented master’s program was developed to fill this gap. The NOVA School of Science and Technology and the Portuguese Naval School co-developed this master’s program with the active participation of MarLEM project partners from academia and industry. In this work, the principles and approach used to design the master’s program are presented, together with its structure and a concise synopsis of the courses’ content. In addition, other international courses covering the same topic are compared. As a result of this work, the teaching materials related to maritime logistics are improved, and the assumptions and methodology that guided the creation of an international master’s program in maritime logistics are disseminated.
Keywords: Education, maritime logistics, shipping, industrial engineering, management, soft skills.
791 Creation of Greater Mekong Subregion Regional Competitiveness through Cluster Mapping
Authors: Danuvasin Charoen
Abstract:
This research investigates cluster development in the area called the Greater Mekong Subregion (GMS), which consists of Thailand, the People’s Republic of China (PRC), the Yunnan Province and Guangxi Zhuang Autonomous Region, Myanmar, the Lao People’s Democratic Republic (Lao PDR), Cambodia, and Vietnam. The study utilized Porter’s competitiveness theory and the cluster mapping approach to analyze the competitiveness of the region. The data collection consists of interviews, focus groups, and the analysis of secondary data. The findings identify some evidence of cluster development in the GMS; however, there is no clear indication of collaboration among the components in the clusters. GMS clusters tend to be stand-alone. The clusters in Vietnam, Lao PDR, Myanmar, and Cambodia tend to be labor intensive, whereas the clusters in Thailand and the PRC (Yunnan) have the potential to successfully develop into innovative clusters. The collaboration and integration among the clusters in the GMS area are promising, though it could take a long time. The most likely relationship between the GMS countries could be, for example, suppliers of the low-end, labor-intensive products will be located in the low income countries such as Myanmar, Lao PDR, and Cambodia, and these countries will be providing input materials for innovative clusters in the middle income countries such as Thailand and the PRC.
Keywords: Greater Mekong Subregion, competitiveness, cluster, development.
790 Analysis on Iranian Wind Catcher and Its Effect on Natural Ventilation as a Solution towards Sustainable Architecture (Case Study: Yazd)
Authors: Mahnaz Mahmoudi Zarandi (Qazvin Islamic Azad University)
Abstract:
Wind catchers have served as a cooling system, providing acceptable ventilation by means of the renewable energy of the wind. In the present study, the city of Yazd, in an arid climate, is selected as the case study. From an architectural point of view, wind catchers are studied here by means of field surveys. The research method is based on random case selection and analytical methods. Wind-catcher typology and the relationships governing wind-catcher architecture are examined here for the first time; 53 wind catchers were analyzed. The typology of the wind catchers is based on physical analysis and on the patterns and common concepts incorporated in them. The thermal behavior of selected archetypal wind catchers is analyzed to show how their architecture influences their operation. Computational fluid dynamics, the Fluent software and numerical analysis are used in this study as the most accurate analytical approach. The results obtained from these analyses show the formal specifications of wind catchers with optimum operation in Yazd. The knowledge obtained from the optimum model could be used for the design and construction of wind catchers with improved operation.
Keywords: Fluent Software, Iranian architecture, wind catcher
789 A Formal Approach for Proof Constructions in Cryptography
Authors: Markus Kaiser, Johannes Buchmann
Abstract:
In this article we explore the application of a formal proof system to verification problems in cryptography. Cryptographic properties concerning the correctness or security of cryptographic algorithms are of great interest. Besides some basic lemmata, we explore an implementation of a complex function that is used in cryptography. More precisely, we describe formal properties of this implementation that we prove by computer. We describe formalized probability distributions (σ-algebras, probability spaces and conditional probabilities), given in the formal language of the formal proof system Isabelle/HOL, and we computer-prove Bayes' formula. We then describe an application of the presented formalized probability distributions to cryptography. Furthermore, this article shows that computer proofs of complex cryptographic functions are possible, by presenting an implementation of the Miller-Rabin primality test that admits formal verification. Our achievements are a step towards computer verification of cryptographic primitives and describe a basis for computer verification in cryptography. Computer verification can be applied to further problems in cryptographic research if the corresponding basic mathematical knowledge is available in a database.
Keywords: prime numbers, primality tests, (conditional) probability distributions, formal proof system, higher-order logic, formal verification, Bayes' formula, Miller-Rabin primality test.
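For reference, a plain Python sketch of the Miller-Rabin probabilistic primality test in its standard formulation; this is an ordinary implementation, not the formally verified Isabelle/HOL version discussed in the paper:

```python
import random

def is_probable_prime(n, rounds=20):
    """Return True if n passes 'rounds' Miller-Rabin witnesses (probably prime)."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):
        if n % p == 0:
            return n == p
    d, r = n - 1, 0
    while d % 2 == 0:                 # write n - 1 = 2^r * d with d odd
        d //= 2
        r += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(r - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False              # a witnesses that n is composite
    return True

print(is_probable_prime(2**61 - 1), is_probable_prime(10**12 + 1))   # True, False
```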
788 Numerical Analysis of Flow in the Gap between a Simplified Tractor-Trailer Model and Cross Vortex Trap Device
Authors: Terrance Charles, Zhiyin Yang, Yiling Lu
Abstract:
Heavy trucks are aerodynamically inefficient due to their un-streamlined body shapes, with more than 60% of engine power being required to overcome aerodynamic drag at 60 mph. Many aerodynamic drag reduction devices have been developed, and this paper presents a study of one such device, the Cross Vortex Trap Device (CVTD), deployed in the gap between the tractor and the trailer of a simplified tractor-trailer model. Numerical simulations have been carried out at a Reynolds number of 0.51×10⁶, based on the inlet flow velocity and the height of the trailer, using the Reynolds-Averaged Navier-Stokes (RANS) approach. Three different configurations of the CVTD have been studied, ranging from one to three slabs equally spaced on the front face of the trailer. The flow field around the three configurations of the trap device has been analysed and is presented. The results show that a maximum drag reduction of 12.25% can be achieved when a triple vortex trap device is used. A detailed flow field analysis along with pressure contours is presented to elucidate the drag reduction mechanisms of the CVTD and why the triple vortex trap configuration produces the maximum drag reduction among the three configurations tested.
Keywords: Aerodynamic drag, cross vortex trap device, truck, RANS.
787 SPA-VNDN: Enhanced Smart Parking Application by Vehicular Named Data Networking
Authors: Bassma Aldahlan, Zongming Fei
Abstract:
Recently, there has been great interest in smart parking applications. These applications are enhanced by vehicular ad-hoc networks, which help drivers find and reserve suitable parking spaces for a period of time ahead of arrival. Named Data Networking (NDN) is a future Internet architecture that benefits vehicular ad-hoc networks because of its clean-slate design and pure communication model. In this paper, we propose an NDN-based framework for smart parking that involves a fog computing architecture. The proposed application has two main directions. First, it allows drivers to query the number of parking spaces in a particular parking lot. Second, it introduces a technique that enables drivers to make intelligent reservations before their arrival time. We also introduce a “push-based” model supporting the NDN-based framework for smart parking applications. To evaluate the proposed solution’s performance, we analyzed the function for finding parking lots with available parking spaces and the function for reserving a parking space. Our system showed high performance in terms of response time and push overhead, and the proposed reservation application performed better than the baseline approach.
Keywords: Cloud Computing, Vehicular Named Data Networking, Smart Parking Applications, Fog Computing.
786 An Alternative Antimicrobial Approach to Fight Bacterial Pathogens from Phellinus linteus
Authors: S. Techaoei, K. Jarmkom, P. Eakwaropas, W. Khobjai
Abstract:
The objective of this research was to investigate the in vitro antimicrobial activity of Phellinus linteus fruiting body extracts against Pseudomonas aeruginosa, Escherichia coli, Staphylococcus aureus and methicillin-resistant Staphylococcus aureus. The Phellinus linteus fruiting body was extracted with ethanol and ethyl acetate, and the solvents were evaporated. The disc diffusion assay was used to assess antimicrobial activity against the tested bacterial strains. Primary screening of the chemical profile of the crude extract was carried out using thin layer chromatography (TLC). Erythromycin and dimethyl sulfoxide were used as the positive and negative controls, respectively. Initial screening of the Phellinus linteus crude extracts with the disc diffusion assay demonstrated that only the ethanol extract showed greater antimicrobial activity against Pseudomonas aeruginosa, Escherichia coli, Staphylococcus aureus and methicillin-resistant Staphylococcus aureus. The minimum inhibitory concentration (MIC) assay showed MICs of 0.5 mg/ml for Pseudomonas aeruginosa and methicillin-resistant Staphylococcus aureus, and 0.25 mg/ml for Escherichia coli and Staphylococcus aureus. The TLC chemical profile of the extract showed Rf ≈ 0.71-0.76.
Keywords: Staphylococcus aureus, Phellinus linteus, methicillin-resistant Staphylococcus aureus, antimicrobial activity, Escherichia coli.
785 Enhancing Word Meaning Retrieval Using FastText and NLP Techniques
Authors: Sankalp Devanand, Prateek Agasimani, V. S. Shamith, Rohith Neeraje
Abstract:
Machine translation has witnessed significant advancements in recent years, but the translation of languages with distinct linguistic characteristics, such as English and Sanskrit, remains a challenging task. This research presents the development of a dedicated English to Sanskrit machine translation model, aiming to bridge the linguistic and cultural gap between these two languages. Using a variety of natural language processing (NLP) approaches including FastText embeddings, this research proposes a thorough method to improve word meaning retrieval. Data preparation, part-of-speech tagging, dictionary searches, and transliteration are all included in the methodology. The study also addresses the implementation of an interpreter pattern and uses a word similarity task to assess the quality of word embeddings. The experimental outcomes show how the suggested approach may be used to enhance word meaning retrieval tasks with greater efficacy, accuracy, and adaptability. Evaluation of the model's performance is conducted through rigorous testing, comparing its output against existing machine translation systems. The assessment includes quantitative metrics such as BLEU scores, METEOR scores, Jaccard Similarity etc.
Keywords: Machine translation, English to Sanskrit, natural language processing, word meaning retrieval, FastText embeddings.
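A small sketch of the word-similarity check used to assess embedding quality: cosine similarity between word vectors and a nearest-neighbour lookup. The tiny vectors below are placeholders standing in for trained FastText embeddings:

```python
import numpy as np

embeddings = {
    "agni":  np.array([0.9, 0.1, 0.0, 0.2]),   # illustrative 4-D vectors only
    "fire":  np.array([0.8, 0.2, 0.1, 0.1]),
    "water": np.array([0.1, 0.9, 0.3, 0.0]),
}

def cosine(u, v):
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def most_similar(word, k=2):
    # rank every other vocabulary word by cosine similarity to the query word
    scores = {w: cosine(embeddings[word], vec)
              for w, vec in embeddings.items() if w != word}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:k]

print(most_similar("fire"))
```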
784 Capacity Building of Extension Agents for Sustainable Dissemination of Agricultural Information and Technologies in Developing Countries
Authors: Michael T. Ajayi, Oluwakemi E. Fapojuwo
Abstract:
Farmers need regular and relevant information about new technologies, and the production of extension materials has been found useful in facilitating this process. Extension materials help provide information that reaches large numbers of farmers quickly and economically. However, as useful as extension materials are, previously produced materials are often not used by farmers. The reasons for this include the lack of involvement of farmers in the production of the extension materials, the limited relevance of many materials to the farmers’ environments, the limited capacity of agricultural extension agents to prepare the materials, and the lack of commitment of many extension agents. These problems led to this innovative approach to capacity building of extension agents, which involves five stages. The first stage is a diagnostic survey of the farmers’ environment to collect useful information. The second stage is the development and production of draft extension materials. The third stage is field testing and evaluation of the draft materials by the same farmers who were involved at the diagnostic stage. The fourth stage is the revision of the draft extension materials by incorporating suggestions from farmers. The fifth stage is the action plans. This process improves the capacity of agricultural extension agents in the preparation of extension materials and also promotes the engagement of farmers and beneficiaries in the process. It also makes farmers assume some level of ownership of the exercise and of the extension materials.
Keywords: Capacity building, dissemination, extension agents, information/technologies.
783 Quality of Life of the Beneficiaries of the Government’s Bolsa Família Program: A Case Study in Mateiros/TO/Brazil
Authors: Mary L. G. S. Senna, Afonso R. Aquino, Veruska C. Dutra, Carlos H. C. Tolentino
Abstract:
Although the quality of life index has prompted many discussions, the concept remains subjective and lacks precision; consequently, many researchers seek to develop methods to measure it and bring it to a more concrete footing. In this study, the quality of life index method was used to analyze the quality of life of the population of Mateiros, Tocantins, Brazil. After data collection, the quality of life index was compared between the general population and the group of beneficiaries of the Brazilian government assistance program Bolsa Família (Family Allowance). Of the people interviewed, 22% receive financial aid from the federal Bolsa Família program. Comparisons were made between the final quality of life index score of the Mateiros population and the following factors: gender, age, education, working or not with tourism, and receiving or not receiving the Bolsa Família. It was observed that only the Bolsa Família factor (p-value 0.0138) showed an association with quality of life improvement, with those receiving financial aid showing a greater improvement in quality of life than the rest of the population. It was concluded that government assistance has been a decisive element in enhancing the quality of life of the Mateiros population, indicating that similar actions should be maintained.
Keywords: Quality of life index, government aid to families, sustainable tourism, Bolsa Familia.
782 Learning to Order Terms: Supervised Interestingness Measures in Terminology Extraction
Authors: Jérôme Azé, Mathieu Roche, Yves Kodratoff, Michèle Sebag
Abstract:
Term extraction, a key data preparation step in text mining, extracts terms, i.e. relevant collocations of words, attached to specific concepts (e.g. genetic-algorithms and decision-trees are terms associated with the concept "Machine Learning"). In this paper, the task of extracting interesting collocations is achieved through a supervised learning algorithm, exploiting a few collocations manually labelled as interesting/not interesting. From these examples, the ROGER algorithm learns a numerical function inducing a ranking on the collocations. This ranking is optimized using genetic algorithms, maximizing the trade-off between the false positive and true positive rates (area under the ROC curve). The approach uses a particular representation for the word collocations, namely the vector of values of the standard statistical interestingness measures attached to each collocation. As this representation is general (over corpora and natural languages), generality tests were performed by applying the ranking function learned from an English corpus in biology to a French corpus of curricula vitae, and vice versa, showing good robustness of the approach compared with the state-of-the-art Support Vector Machine (SVM).
Keywords: Text-mining, Terminology Extraction, Evolutionary algorithm, ROC Curve.
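A compact sketch of the fitness being maximised: the area under the ROC curve (AUC) of a candidate ranking function, computed from labelled collocations via the rank-sum statistic. The scores and labels are illustrative; in the described approach this value is what the genetic algorithm optimises over the ranking function's parameters:

```python
import numpy as np

def auc(scores, labels):
    """AUC of a ranking: the probability that a randomly chosen interesting
    collocation is scored above a randomly chosen non-interesting one
    (ties are ignored for brevity)."""
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels, dtype=bool)
    order = np.argsort(scores)
    ranks = np.empty(len(scores), dtype=float)
    ranks[order] = np.arange(1, len(scores) + 1)      # 1-based ranks
    n_pos, n_neg = labels.sum(), (~labels).sum()
    return (ranks[labels].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

# scores produced by a candidate ranking function for five labelled collocations
print(auc([0.9, 0.8, 0.4, 0.35, 0.1], [1, 1, 0, 1, 0]))   # 0.833...
```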
781 Data-driven Multiscale Tsallis Complexity: Application to EEG Analysis
Authors: Young-Seok Choi
Abstract:
This work proposes data-driven, multiscale quantitative measures to reveal the underlying complexity of the electroencephalogram (EEG), applied to a rodent model of hypoxic-ischemic brain injury and recovery. Motivated by the fact that real EEG recordings are nonlinear and non-stationary over different frequencies or scales, a more suitable approach than conventional single-scale tools is needed for analyzing EEG data. Here, we present a new framework of complexity measures that considers changing dynamics over multiple oscillatory scales. The proposed multiscale complexity is obtained by calculating entropies of the probability distributions of the intrinsic mode functions extracted by the empirical mode decomposition (EMD) of the EEG. To quantify EEG recordings of a rat model of hypoxic-ischemic brain injury following cardiac arrest, the multiscale version of the Tsallis entropy is examined. To validate the proposed complexity measure, actual EEG recordings from rats (n=9) experiencing 7 min of cardiac arrest followed by resuscitation were analyzed. Experimental results demonstrate that the use of the multiscale Tsallis entropy leads to better discrimination of the injury levels and improved correlations with the neurological deficit evaluation 72 hours after cardiac arrest, thus suggesting an effective metric as a prognostic tool.
Keywords: Electroencephalogram (EEG), multiscale complexity, empirical mode decomposition, Tsallis entropy.
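A minimal sketch of the entropy step: the Tsallis entropy of each intrinsic mode function's amplitude distribution. Synthetic oscillations stand in for IMFs obtained from an empirical mode decomposition of the EEG, and the entropic index q and bin count are illustrative choices:

```python
import numpy as np

def tsallis_entropy(signal, q=2.0, bins=64):
    # S_q = (1 - sum_i p_i^q) / (q - 1); reduces to Shannon entropy as q -> 1
    hist, _ = np.histogram(signal, bins=bins)
    p = hist[hist > 0] / hist.sum()
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

def multiscale_tsallis(imfs, q=2.0):
    # one entropy value per oscillatory scale (per IMF)
    return [tsallis_entropy(imf, q=q) for imf in imfs]

t = np.linspace(0, 10, 5000)
fake_imfs = [np.sin(2 * np.pi * f * t) + 0.1 * np.random.randn(t.size) for f in (2, 8, 20)]
print(multiscale_tsallis(fake_imfs))
```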
780 Using Scrum in an Online Smart Classroom Environment: A Case Study
Authors: Ye Wei, Sitalakshmi Venkatraman, Fahri Benli, Fiona Wahr
Abstract:
The present digital world poses many challenges to various stakeholders in the education sector. In particular, lecturers in higher education (HE) are faced with the problem of ensuring that students are able to achieve the required learning outcomes despite rapid changes taking place worldwide. Different strategies have recently been adopted to retain student engagement and commitment in classrooms and to address the differences in learning habits, preferences and styles of the digital generation of students. Further, with the onset of the coronavirus disease (COVID-19) pandemic, the online classroom has become the most suitable alternative teaching environment to cope with lockdown restrictions. These changes have compounded the problems of learning engagement and the short attention span of HE students. New agile methodologies that have been successfully employed to manage projects in different fields are gaining prominence in the education domain. In this paper, we present the application of Scrum as an agile methodology to enhance student learning and engagement in an online smart classroom environment. We demonstrate the use of our proposed approach through a case study teaching key topics in information technology that require students to gain technical and business-related data analytics skills.
Keywords: Agile methodology, Scrum, online learning, smart classroom environment, student engagement, active learning.
779 Diagnosing the Cause and its Timing of Changes in Multivariate Process Mean Vector from Quality Control Charts using Artificial Neural Network
Authors: Farzaneh Ahmadzadeh
Abstract:
Quality control charts are very effective in detecting out-of-control signals, but when a control chart signals an out-of-control condition of the process mean, searching for a special cause in the vicinity of the signal time does not always lead to prompt identification of the source(s) of the out-of-control condition, as the change point in the process parameter(s) is usually different from the signal time. It is very important for the manufacturer to determine at what point in the past, and in which parameters, the change that caused the signal occurred. Early warning of a process change would expedite the search for the special causes and enhance quality at lower cost. In this paper, the quality variables under investigation are assumed to follow a multivariate normal distribution with known mean vector and variance-covariance matrix, the process means after a one-step change are assumed to remain at the new level until the special cause is identified and removed, and it is supposed that only one variable can change at a time. This research applies an artificial neural network (ANN) to identify the time the change occurred and the parameter that caused the change or shift. The performance of the approach was assessed through a computer simulation experiment. The results show that the neural network performs effectively and equally well over the whole range of shift magnitudes considered.
Keywords: Artificial neural network, change point estimation, Monte Carlo simulation, multivariate exponentially weighted moving average.
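A short sketch of how one training example for such a network can be simulated under the stated assumptions (multivariate normal observations and a sustained one-step mean shift in exactly one variable); the dimensions and shift magnitude are illustrative:

```python
import numpy as np

def simulate_run(n=100, p=3, shift=1.5, rng=None):
    """Generate one simulated run: p quality variables, in control up to a random
    change point tau, after which one variable's mean shifts by 'shift' standard
    deviations. The network's training targets are (tau, shifted variable)."""
    rng = rng if rng is not None else np.random.default_rng()
    tau = int(rng.integers(20, n - 20))          # true change point
    variable = int(rng.integers(p))              # which variable shifted
    data = rng.standard_normal((n, p))           # in-control N(0, I) observations
    data[tau:, variable] += shift                # sustained one-step mean shift
    return data, tau, variable

X, tau, var = simulate_run(rng=np.random.default_rng(7))
print(f"simulated change point at t={tau} in variable {var}")
```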
778 A Pairwise-Gaussian-Merging Approach: Towards Genome Segmentation for Copy Number Analysis
Authors: Chih-Hao Chen, Hsing-Chung Lee, Qingdong Ling, Hsiao-Jung Chen, Sun-Chong Wang, Li-Ching Wu, H.C. Lee
Abstract:
Segmentation, filtering out of measurement errors and identification of breakpoints are integral parts of any analysis of microarray data for the detection of copy number variation (CNV). Existing algorithms designed for these tasks have had some successes in the past, but they tend to be O(N²) in either computation time or memory requirement, or both, and the rapid advance of microarray resolution has practically rendered such algorithms useless. Here we propose an algorithm, SAD, that is much faster and requires far less memory, being O(N) in both computation time and memory requirement, and offers higher accuracy. The two key ingredients of SAD are the fundamental assumption in statistics that measurement errors are normally distributed and the mathematical relation that the product of two Gaussians is another Gaussian (function). We have produced a computer program for analyzing CNV based on SAD. In addition to being fast and small, it offers two important features: quantitative statistics for its predictions and, with only two user-decided parameters, ease of use. Its speed shows little dependence on the genomic profile. Running on an average modern computer, it completes CNV analyses for a 262 thousand-probe array in ~1 second and for a 1.8 million-probe array in 9 seconds.
Keywords: Cancer, pathogenesis, chromosomal aberration, copy number variation, segmentation analysis.
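A toy illustration of the two ingredients named above: merging two Gaussian estimates via their product (a precision-weighted combination) and a simple left-to-right O(N) pass that keeps merging probes into the running segment until a probe deviates too far, at which point a breakpoint is declared. The thresholds and simulated probe values are illustrative, and this is not the published SAD code:

```python
import numpy as np

def merge_gaussians(mu1, var1, mu2, var2):
    # product of two Gaussians: precision-weighted mean and combined variance
    precision = 1.0 / var1 + 1.0 / var2
    var = 1.0 / precision
    mu = var * (mu1 / var1 + mu2 / var2)
    return mu, var

def segment(values, noise_var, merge_threshold=3.5):
    """Left-to-right pass: merge each probe into the current segment estimate
    unless it deviates by more than 'merge_threshold' standard errors."""
    segments, start, mu, var = [], 0, values[0], noise_var
    for i, x in enumerate(values[1:], start=1):
        if abs(x - mu) / np.sqrt(var + noise_var) > merge_threshold:
            segments.append((start, i, mu))      # breakpoint: close the segment
            start, mu, var = i, x, noise_var
        else:
            mu, var = merge_gaussians(mu, var, x, noise_var)
    segments.append((start, len(values), mu))
    return segments

probes = np.concatenate([np.random.normal(0.0, 0.2, 150), np.random.normal(1.0, 0.2, 100)])
print(segment(probes, noise_var=0.04))           # expect a breakpoint near index 150
```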