Search results for: 3D numerical approach
13470 Genetic Algorithms for Parameter Identification of DC Motor ARMAX Model and Optimal Control
Authors: A. Mansouri, F. Krim
Abstract:
This paper presents two techniques for DC motor parameter identification. We propose a numerical method using the adaptive extensive recursive least squares (AERLS) algorithm for real-time parameter estimation. This algorithm, based on the minimization of a quadratic criterion, is realized in simulation for parameter identification of the DC motor autoregressive moving average with extra inputs (ARMAX) model. As an advanced technique, we use genetic algorithm (GA) identification with biased estimation for high dynamic performance speed regulation. DC motors are extensively used in variable speed drives, for robot and solar panel trajectory control. GA effectiveness is demonstrated through a comparison of the two approaches.
Keywords: ARMAX model, DC motor, AERLS, GA, optimization, parameter identification, PID speed regulation
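As background only, the sketch below shows a plain recursive least squares (RLS) update with a forgetting factor for a simplified ARX regression of motor speed on past outputs and the input voltage. It is a minimal Python illustration, not the paper's AERLS algorithm: the model orders, forgetting factor, toy motor coefficients and noise level are all assumptions, and the moving-average noise term of the full ARMAX model is omitted.

```python
import numpy as np

def rls_identify(y, u, na=2, nb=2, lam=0.98):
    """Recursive least squares for an ARX-type model
    y[k] = -a1*y[k-1] - ... - a_na*y[k-na] + b1*u[k-1] + ... + b_nb*u[k-nb].
    Returns the estimated parameter vector [a1..a_na, b1..b_nb]."""
    n = na + nb
    theta = np.zeros(n)            # parameter estimates
    P = np.eye(n) * 1e4            # large initial covariance
    for k in range(max(na, nb), len(y)):
        # regressor of past outputs and inputs
        phi = np.concatenate([-y[k-na:k][::-1], u[k-nb:k][::-1]])
        err = y[k] - phi @ theta                   # a priori prediction error
        gain = P @ phi / (lam + phi @ P @ phi)     # Kalman-like gain
        theta = theta + gain * err                 # parameter update
        P = (P - np.outer(gain, phi) @ P) / lam    # covariance update with forgetting
    return theta

# toy usage: second-order discrete motor model driven by a random binary voltage
rng = np.random.default_rng(0)
u = np.where(rng.random(500) > 0.5, 1.0, -1.0)     # persistently exciting input
y = np.zeros(500)
for k in range(2, 500):
    y[k] = 1.5*y[k-1] - 0.6*y[k-2] + 0.05*u[k-1] + 0.03*u[k-2] + 0.01*rng.standard_normal()
print(rls_identify(y, u))  # estimates approach [-1.5, 0.6, 0.05, 0.03]
```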
Procedia PDF Downloads 381
13469 Recognition of Noisy Words Using the Time Delay Neural Networks Approach
Authors: Khenfer-Koummich Fatima, Mesbahi Larbi, Hendel Fatiha
Abstract:
This paper presents a recognition system for isolated words such as robot commands, carried out with time delay neural networks (TDNN). The aim is to teleoperate a robot for specific tasks such as turn, close, etc., in an industrial environment, taking into account the noise coming from the machines. The choice of TDNN is based on its generalization in terms of accuracy; moreover, it acts as a filter that allows the passage of certain desirable frequency characteristics of speech. The goal is to determine the parameters of this filter so as to make the system adaptable to the variability of the speech signal and especially to noise; for this, the back-propagation technique was used in the learning phase. The approach was applied to commands pronounced separately in two languages: French and Arabic. The results for two test bases of 300 spoken words each are 87% and 97.6% in a neutral environment, and 77.67% and 92.67% when white Gaussian noise was added with an SNR of 35 dB.
Keywords: TDNN, neural networks, noise, speech recognition
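To illustrate the time-delay idea only (not the authors' trained network), the following Python/NumPy sketch implements a TDNN layer in which each output frame is computed from a short window of consecutive input feature frames, which is what gives the network its filter-like behaviour; the layer sizes, context width and random weights are assumptions.

```python
import numpy as np

def tdnn_layer(frames, weights, bias, context=3):
    """One time-delay layer: frames is (T, D); each output frame t is computed
    from the concatenation of input frames t .. t+context-1 (a sliding window),
    so the shared weights act as a learned FIR-like filter along time."""
    T, D = frames.shape
    out = []
    for t in range(T - context + 1):
        window = frames[t:t + context].ravel()       # (context*D,) delayed inputs
        out.append(np.tanh(weights @ window + bias)) # same weights at every time step
    return np.array(out)

# toy usage with assumed sizes: 40 frames of 12 MFCC-like features
rng = np.random.default_rng(1)
x = rng.standard_normal((40, 12))
W1, b1 = rng.standard_normal((16, 3 * 12)) * 0.1, np.zeros(16)   # first hidden layer
W2, b2 = rng.standard_normal((8, 3 * 16)) * 0.1, np.zeros(8)     # second hidden layer
h = tdnn_layer(x, W1, b1, context=3)
y = tdnn_layer(h, W2, b2, context=3)
print(y.shape)  # (36, 8): time resolution shrinks with each context window
```

In a real recognizer the outputs would be pooled over time, fed to a word-level softmax, and the weights trained with back-propagation as described above.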
Procedia PDF Downloads 289
13468 Symo-syl: A Meta-Phonological Intervention to Support Italian Pre-Schoolers’ Emergent Literacy Skills
Authors: Tamara Bastianello, Rachele Ferrari, Marinella Majorano
Abstract:
The adoption of the syllabic approach in preschool programmes could support and reinforce meta-phonological awareness and literacy skills in children. The introduction of a meta-phonological intervention in preschool could facilitate the transition to primary school, especially for children with learning fragilities. In the present contribution, we investigate the efficacy of the "Simo-syl" intervention in enhancing emergent literacy skills in children (especially reading). Simo-syl is a 12-week multimedia programme developed to improve children's language and communication skills and later literacy development in preschool. During the intervention, Simo-syl, an invented character, leads children through a series of meta-phonological games. Forty-six Italian preschool children (i.e., the Simo-syl group) participated in the programme; seventeen preschool children (i.e., the control group) did not participate in the intervention. Children in the two groups were between 4;10 and 5;9 years old. They were assessed on their vocabulary, morpho-syntactical, meta-phonological, phonological, and phono-articulatory skills twice: 1) at the beginning of the last year of preschool, through standardised paper-based assessment tools, and 2) one week after the intervention. All children in the Simo-syl group took part in the meta-phonological programme based on the syllabic approach. The intervention lasted 12 weeks (three activities per week; week 1: activities focused on syllable blending and spelling and a first approach to the written code; weeks 2-11: activities focused on syllable recognition; week 12: activities focused on vowel recognition). Only a subset of the children (Simo-syl group = 21, control group = 9) were tested again (post-test) one week after the intervention. Before starting the intervention programme, the Simo-syl and control groups had similar meta-phonological, phonological and lexical skills (all ps > .05). One week after the intervention, a significant difference emerged between the two groups in their meta-phonological skills (syllable blending, p = .029; syllable spelling, p = .032), in their vowel recognition ability (p = .032) and in their word reading skills (p = .05). An ANOVA confirmed the effect of group membership on developmental growth for the word reading task (F(1,28) = 6.83, p = .014, ηp² = .196). Taking part in the Simo-syl intervention has a positive effect on the ability to read in preschool children.
Keywords: intervention programme, literacy skills, meta-phonological skills, syllabic approach
Procedia PDF Downloads 163
13467 Real-Time Episodic Memory Construction for Optimal Action Selection in Cognitive Robotics
Authors: Deon de Jager, Yahya Zweiri, Dimitrios Makris
Abstract:
The three most important components in the cognitive architecture for cognitive robotics are memory representation, memory recall, and action selection performed by the executive. In this paper, action selection, performed by the executive, is defined as a memory quantification and optimization process. The methodology describes the real-time construction of episodic memory through semantic memory optimization. The optimization is performed by set-based particle swarm optimization, using an adaptive entropy memory quantification approach for fitness evaluation. The performance of the approach is experimentally evaluated by simulation, where a UAV is tasked with the collection and delivery of a medical package. The experiments show that the UAV dynamically uses the episodic memory to autonomously control its velocity while successfully completing its mission.
Keywords: cognitive robotics, semantic memory, episodic memory, maximum entropy principle, particle swarm optimization
Procedia PDF Downloads 156
13466 Mechanical Characteristics on Fatigue Crack Propagation in Aluminum Plate
Authors: A. Chellil, A. Nour, S. Lecheb, H. Mechakra, L. Addar, H. Kebir
Abstract:
This paper presents the mechanical characteristics of fatigue crack propagation in an aluminium plate, based on the strain and stress distributions computed with the Abaqus software. The changes in the shear strain and stress distributions during the fatigue cycle with crack growth are identified. As the crack progresses, the strain and the stress in the critical zone increase. Numerical modal analysis of the developed model proves that the eigenfrequencies of the aluminium plate decreased after cracking, and that this reduction is nonlinear. These results can provide a reference for analysts and designers of aluminium alloys in aeronautical systems. Therefore, modal analysis is an important factor for monitoring aeronautic structures.
Keywords: aluminum alloys, plate, crack, failure
Procedia PDF Downloads 428
13465 New Approach for Load Modeling
Authors: Slim Chokri
Abstract:
Load forecasting is one of the central functions in power system operations. Electricity cannot be stored, which means that for an electric utility, an estimate of future demand is necessary for managing production and purchasing in an economically reasonable way. A majority of the recently reported approaches are based on neural networks. The attraction of these methods lies in the assumption that neural networks are able to learn properties of the load. However, the development of the methods is not finished, and the lack of comparative results on different model variations is a problem. This paper presents a new approach to predicting the Tunisian daily peak load. The proposed method employs a computational intelligence scheme based on a fuzzy neural network (FNN) and support vector regression (SVR). The experimental results obtained indicate that our proposed FNN-SVR technique gives significantly better prediction accuracy than some classical techniques.
Keywords: neural network, load forecasting, fuzzy inference, machine learning, fuzzy modeling and rule extraction, support vector regression
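As a hedged illustration of the SVR component only (the fuzzy neural network part and the actual Tunisian load data are not reproduced), a minimal scikit-learn sketch for one-day-ahead peak load regression from lagged loads and a calendar feature might look as follows; the synthetic data, feature choice and kernel settings are assumptions.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
days = 400
# synthetic daily peak load with weekly seasonality, a stand-in for real data
load = 1000 + 100*np.sin(2*np.pi*np.arange(days)/7) + 20*rng.standard_normal(days)

# features: loads of the previous 7 days plus day-of-week; target: next day's peak
X = np.array([np.r_[load[t-7:t], t % 7] for t in range(7, days-1)])
y = load[8:days]
X_train, X_test, y_train, y_test = X[:300], X[300:], y[:300], y[300:]

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=100.0, epsilon=1.0))
model.fit(X_train, y_train)
pred = model.predict(X_test)
mape = np.mean(np.abs((y_test - pred) / y_test)) * 100
print(f"test MAPE: {mape:.2f}%")
```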
Procedia PDF Downloads 435
13464 Impact Assessment of Lean Practices on Social Sustainability Indicators: An Approach Using ISM Method
Authors: Aline F. Marcon, Eduardo F. da Silva, Marina Bouzon
Abstract:
The impact of lean management on environmental sustainability is the research line that receives the most attention from academics, whereas the social dimension of sustainable development has so far received less attention. This paper aims to evaluate the impact of intra-plant lean manufacturing practices on social sustainability indicators extracted from the Global Reporting Initiative (GRI) parameters. The method is two-phased, combining a multi-criteria decision-making (MCDM) approach to uncover the most relevant practices regarding social performance and the Interpretive Structural Modeling (ISM) method to reveal the structural relationships among lean practices. Professionals from the academic and industrial fields answered the questionnaires. From the results of this paper, it is possible to verify that practices such as “Safety Improvement Programs”, “Total Quality Management” and “Cross-functional Workforce” are the ones with the most positive influence on the set of GRI social indicators.
Keywords: indicators, ISM, lean, social, sustainability
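To make the ISM step concrete, here is a minimal, hypothetical Python sketch of its core computation: starting from a binary direct-influence matrix, Warshall's transitive closure yields the reachability matrix, and reachability/antecedent sets are then used to peel off the hierarchy levels. The practice names and the influence matrix are placeholders, not the paper's survey data.

```python
import numpy as np

practices = ["Safety programs", "TQM", "Cross-functional workforce", "5S", "Kaizen"]  # placeholders
# assumed direct-influence matrix: A[i, j] = 1 if practice i influences practice j
A = np.array([[1, 1, 0, 0, 1],
              [0, 1, 1, 0, 1],
              [0, 0, 1, 0, 0],
              [0, 1, 1, 1, 0],
              [0, 0, 1, 0, 1]])

# transitive closure (Warshall) gives the ISM reachability matrix
R = A.copy()
n = len(R)
for k in range(n):
    for i in range(n):
        for j in range(n):
            R[i, j] = R[i, j] or (R[i, k] and R[k, j])

# level partitioning: a practice sits at the current level when its
# reachability set equals the intersection with its antecedent set
remaining, level = set(range(n)), 1
while remaining:
    reach = {i: {j for j in remaining if R[i, j]} for i in remaining}
    antec = {i: {j for j in remaining if R[j, i]} for i in remaining}
    top = [i for i in remaining if reach[i] == (reach[i] & antec[i])]
    print(f"Level {level}: {[practices[i] for i in top]}")
    remaining -= set(top)
    level += 1
```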
Procedia PDF Downloads 148
13463 Development of Industry Sector Specific Factory Standards
Authors: Peter Burggräf, Moritz Krunke, Hanno Voet
Abstract:
Due to shortening product and technology lifecycles, many companies use standardization approaches in product development and factory planning to reduce costs and time to market. Unlike large companies, where modular systems are already widely used, small and medium-sized companies often show a much lower degree of standardization due to lower scale effects and missing capacities for the development of these standards. To overcome these challenges, the development of industry sector specific standards in cooperations or by third parties is an interesting approach. Using the example of the German industry, this paper analyzes which sectors, being mainly dominated by small or medium-sized companies, might be especially interesting for the development of factory standards. For this purpose, a key-performance-indicator-based approach was developed, which is presented in detail together with its specific results for the German industry structure.
Keywords: factory planning, factory standards, industry sector specific standardization, production planning
Procedia PDF Downloads 394
13462 Design Optimization of Miniature Mechanical Drive Systems Using Tolerance Analysis Approach
Authors: Eric Mxolisi Mkhondo
Abstract:
Geometrical deviations and the interaction of mechanical parts influence the performance of miniature systems. These deviations tend to cause costly problems during assembly due to imperfections of components, which are invisible to the naked eye. They also tend to cause unsatisfactory performance during operation due to deformation caused by environmental conditions. One of the effective tools to manage the deviations and interaction of parts in a system is tolerance analysis. This is a quantitative tool for predicting the tolerance variations, which are defined during the design process. Traditional tolerance analysis assumes that the assembly is static and that the deviations come from manufacturing discrepancies, overlooking the functionality of the whole system and the deformation of parts due to the effect of environmental conditions. This paper presents an integrated tolerance analysis approach for a miniature system in operation. In this approach, a computer-aided design (CAD) model is developed from the system’s specification. The CAD model is then used to specify the geometrical and dimensional tolerance limits (upper and lower limits) that vary the components’ geometries and sizes while conforming to functional requirements. Worst-case tolerances are analyzed to determine the influence of dimensional changes due to the effects of operating temperatures. The method is used to evaluate the nominal conditions and the worst-case conditions at the maximum and minimum dimensions of the assembled components. These three conditions are evaluated at specific operating temperatures (-40°C, -18°C, 4°C, 26°C, 48°C, and 70°C). A case study on the mechanism of a zoom lens system is used to illustrate the effectiveness of the methodology.
Keywords: geometric dimensioning, tolerance analysis, worst-case analysis, zoom lens mechanism
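To illustrate the kind of computation involved, the short Python sketch below performs a one-dimensional worst-case stack-up of an assembly gap that includes linear thermal expansion of each part, evaluated at the operating temperatures listed above; the dimensions, tolerances and materials are entirely hypothetical and do not reproduce the zoom lens case study.

```python
# Hypothetical 1-D worst-case tolerance stack for an assembly gap,
# including linear thermal expansion of each component: L(T) = L * (1 + alpha * dT).
REFERENCE_T = 20.0  # deg C at which nominal dimensions and tolerances apply

# (nominal length mm, +/- tolerance mm, CTE 1/degC, sign in the stack)
parts = [
    (12.000, 0.020, 23.6e-6, +1),   # aluminium spacer (assumed)
    (30.000, 0.030,  7.1e-6, +1),   # glass-filled barrel (assumed)
    (41.900, 0.040, 11.7e-6, -1),   # steel housing bore (assumed, subtracted)
]

def gap_limits(temp_c):
    """Return (min_gap, nominal_gap, max_gap) at the given temperature,
    combining worst-case tolerances with thermal growth of each part."""
    dT = temp_c - REFERENCE_T
    nominal = minimum = maximum = 0.0
    for length, tol, cte, sign in parts:
        grown = length * (1.0 + cte * dT)        # thermally expanded length
        nominal += sign * grown
        # worst case: every tolerance pushes the gap in the unfavourable direction
        minimum += sign * grown - tol
        maximum += sign * grown + tol
    return minimum, nominal, maximum

for t in (-40, -18, 4, 26, 48, 70):              # operating temperatures from the paper
    lo, nom, hi = gap_limits(t)
    print(f"{t:>4} degC  gap = {nom:+.3f} mm  worst case [{lo:+.3f}, {hi:+.3f}] mm")
```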
Procedia PDF Downloads 165
13461 An Analytical Method for Bending Rectangular Plates with All Edges Clamped Supported
Authors: Yang Zhong, Heng Liu
Abstract:
The decoupling method and the modified Navier method are combined for accurate bending analysis of rectangular thick plates with all edges clamped supported. The basic governing equations for Mindlin plates are first decoupled into independent partial differential equations which can be solved separately. Using the modified Navier method, the analytic solution of a rectangular thick plate with all edges clamped supported is then derived. The solution method used in this paper leaves out the complicated derivation for calculating coefficients and obtains the solution to the problem directly. Finally, numerical comparisons show the correctness and accuracy of the results.
Keywords: Mindlin plates, decoupling method, modified Navier method, bending rectangular plates
Procedia PDF Downloads 600
13460 Temperature Distribution Enhancement in a Conical Diffuser Fitted with Helical Screw-Tape with and without Center-Rod
Authors: Ehan Sabah Shukri, Wirachman Wisnoe
Abstract:
The temperature distribution in a conical diffuser fitted with a helical screw-tape, with and without a center-rod, is investigated numerically. A helical screw-tape is inserted in the diffuser to create a swirl flow that helps to enhance the temperature distribution rate at an inlet Reynolds number of 4.3 × 10⁴. Three pitch length ratios (Y/L = 0.153, 0.23 and 0.307) for the helical screw-tape with and without the center-rod are simulated and compared. The geometry of the conical diffuser and the inlet condition for both arrangements are kept constant. The numerical findings show that the helical screw-tape inserts without the center-rod perform significantly better than the helical tape inserts with the center-rod in the conical diffuser.
Keywords: diffuser, temperature distribution, CFD, pitch ratio
Procedia PDF Downloads 410
13459 Numerical Simulation and Laboratory Tests for Rebar Detection in Reinforced Concrete Structures using Ground Penetrating Radar
Authors: Maha Al-Soudani, Gilles Klysz, Jean-Paul Balayssac
Abstract:
The aim of this paper is to use Ground Penetrating Radar (GPR) as a non-destructive testing (NDT) method and to increase its accuracy in recognizing the geometry of reinforced concrete structures and, in particular, the position of the steel bars. This information will help managers to assess the state of their structures, on the one hand with respect to safety constraints and, on the other, to quantify the need for maintenance and repair. Several configurations of acquisition and processing of the simulated signal were tested in order to propose and develop an appropriate imaging algorithm in the propagation medium to locate the rebar accurately. The imaging algorithm was subsequently validated experimentally on real reinforced concrete structures. The results indicate that this algorithm is capable of estimating the reinforcing steel bar position to within 0-1 mm.
Keywords: GPR, NDT, reinforced concrete structures, rebar location
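As background on how a GPR echo is turned into a rebar position (this is not the authors' imaging algorithm), the depth is commonly estimated from the two-way travel time of the reflected pulse and the wave velocity in concrete, v = c / sqrt(εr); the minimal Python sketch below applies that relation with an assumed relative permittivity.

```python
import math

C0 = 0.2998  # speed of light in vacuum, m/ns

def rebar_depth(two_way_time_ns, rel_permittivity=8.0):
    """Estimate reflector depth from the two-way travel time of a GPR echo.
    Velocity in the medium: v = c / sqrt(eps_r); depth = v * t / 2.
    The relative permittivity of concrete (here 8.0) is an assumed value and
    in practice is calibrated, e.g. from a core or a reflector of known depth."""
    v = C0 / math.sqrt(rel_permittivity)   # wave velocity in concrete, m/ns
    return v * two_way_time_ns / 2.0       # metres

# example: a 1.0 ns two-way echo in concrete with eps_r = 8
print(f"estimated cover depth: {rebar_depth(1.0) * 1000:.1f} mm")
```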
Procedia PDF Downloads 504
13458 Clustered Regularly Interspaced Short Palindromic Repeats Interference (CRISPRi): An Approach to Inhibit Microbial Biofilm
Authors: Azna Zuberi
Abstract:
A biofilm is a sessile bacterial accretion in which bacteria adopt physiological and morphological behaviour different from the planktonic form. It is the root cause of about 80% of microbial infections in humans. Among them, E. coli biofilms are most prevalent in medical device-associated nosocomial infections. The objective of this study was to inhibit biofilm formation by targeting the luxS gene, involved in quorum sensing, using CRISPRi. luxS is a synthase involved in the synthesis of autoinducer-2 (AI-2), which in turn guides the initial stage of biofilm formation. To implement the CRISPRi system, we synthesized sgRNA complementary to the target gene sequence and co-expressed it with dCas9. Suppression of luxS was confirmed through qRT-PCR. The effect of the luxS gene on biofilm inhibition was studied through the crystal violet assay, the XTT reduction assay and scanning electron microscopy. We conclude that the CRISPRi system could be a potential strategy to inhibit bacterial biofilms through a mechanism-based approach.
Keywords: biofilm, CRISPRi, luxS, microbial
Procedia PDF Downloads 183
13457 Community Forest Management Practice in Nepal: Public Understanding of Forest Benefit
Authors: Chandralal Shrestha
Abstract:
In developing countries like Nepal, the community-based forest management approach has often been glorified as one of the best forest management alternatives for maximizing forest benefits. Though the approach has succeeded in constructing local-level institutions and conserving forest biodiversity, the question of how local communities perceive the forest benefits has remained largely unaddressed by researchers and policy makers. This paper aims to explore the understanding of forest benefits from the perspective of the local communities who use the forests, in terms of institutional stability, equity and livelihood opportunity, and ecological stability. The paper reveals that the local communities have a mixed understanding of the forest benefits. The institutional and ecological activities carried out by the local communities indicate that they have a better understanding of the forest benefits. However, inequality in sharing the forest benefits, a low pricing strategy with its negative consequences for the valuation of forest products, and limited livelihood opportunities indicate a poor understanding.
Keywords: community based forest management, forest benefits, lowland, Nepal
Procedia PDF Downloads 312
13456 An Algorithm of Set-Based Particle Swarm Optimization with Status Memory for Traveling Salesman Problem
Authors: Takahiro Hino, Michiharu Maeda
Abstract:
Particle swarm optimization (PSO) is an optimization approach that mimics the social model of bird flocking and fish schooling. PSO works in continuous space and can solve continuous optimization problems with high quality. Set-based particle swarm optimization (SPSO) operates in discrete space by using sets. SPSO can solve combinatorial optimization problems with high quality and has been applied successfully to large-scale problems. In this paper, we present an algorithm of SPSO with status memory (SPSOSM), which decides the position based on the previous position, for solving the traveling salesman problem (TSP). In order to show the effectiveness of our approach, we examine SPSOSM on the TSP and compare it to existing algorithms.
Keywords: combinatorial optimization problems, particle swarm optimization, set-based particle swarm optimization, traveling salesman problem
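For readers unfamiliar with discrete PSO variants, the following Python sketch implements a simple swap-sequence PSO for the TSP, in which positions are tours and velocities are lists of swaps. It is a generic illustration of this family of methods, not the SPSOSM algorithm with status memory presented here, and all parameter values are assumptions.

```python
import random

def tour_length(tour, dist):
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def swaps_towards(current, target):
    """Swap sequence that transforms `current` into `target` (a velocity term)."""
    cur, swaps = current[:], []
    pos = {city: i for i, city in enumerate(cur)}
    for i, city in enumerate(target):
        if cur[i] != city:
            j = pos[city]
            swaps.append((i, j))
            pos[cur[i]], pos[cur[j]] = j, i
            cur[i], cur[j] = cur[j], cur[i]
    return swaps

def pso_tsp(dist, n_particles=20, iters=300, c1=0.6, c2=0.6, seed=0):
    random.seed(seed)
    n = len(dist)
    tours = [random.sample(range(n), n) for _ in range(n_particles)]
    pbest = [t[:] for t in tours]
    gbest = min(pbest, key=lambda t: tour_length(t, dist))[:]
    for _ in range(iters):
        for k, tour in enumerate(tours):
            velocity = []
            # keep each swap with probability c1 (toward personal best) / c2 (toward global best)
            velocity += [s for s in swaps_towards(tour, pbest[k]) if random.random() < c1]
            velocity += [s for s in swaps_towards(tour, gbest) if random.random() < c2]
            for i, j in velocity:
                tour[i], tour[j] = tour[j], tour[i]
            if tour_length(tour, dist) < tour_length(pbest[k], dist):
                pbest[k] = tour[:]
                if tour_length(tour, dist) < tour_length(gbest, dist):
                    gbest = tour[:]
    return gbest, tour_length(gbest, dist)

# toy usage on random city coordinates
random.seed(1)
pts = [(random.random(), random.random()) for _ in range(15)]
d = [[((xa - xb) ** 2 + (ya - yb) ** 2) ** 0.5 for xb, yb in pts] for xa, ya in pts]
print(pso_tsp(d))
```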
Procedia PDF Downloads 553
13455 Phenotype Prediction of DNA Sequence Data: A Machine and Statistical Learning Approach
Authors: Mpho Mokoatle, Darlington Mapiye, James Mashiyane, Stephanie Muller, Gciniwe Dlamini
Abstract:
Great advances in high-throughput sequencing technologies have resulted in the availability of huge amounts of sequencing data in public and private repositories, enabling a holistic understanding of complex biological phenomena. Sequence data are used for a wide range of applications such as gene annotation, expression studies, personalized treatment and precision medicine. However, this rapid growth in sequence data poses a great challenge which calls for novel data processing and analytic methods, as well as huge computing resources. In this work, a machine and statistical learning approach for DNA sequence classification based on a k-mer representation of sequence data is proposed. The approach is tested using whole genome sequences of Mycobacterium tuberculosis (MTB) isolates to (i) reduce the size of genomic sequence data, (ii) identify an optimum size of k-mers and utilize it to build classification models, (iii) predict the phenotype from whole genome sequence data of a given bacterial isolate, and (iv) demonstrate the computing challenges associated with the analysis of whole genome sequence data in producing interpretable and explainable insights. The classification models were trained on 104 whole genome sequences of MTB isolates. Cluster analysis showed that k-mers may be used to discriminate phenotypes and that the discrimination becomes more concise as the size of the k-mers increases. The best performing classification model had a k-mer size of 10 (the longest k-mer) and an accuracy, recall, precision, specificity, and Matthews correlation coefficient of 72.0%, 80.5%, 80.5%, 63.6%, and 0.4, respectively. This study provides a comprehensive approach for resampling whole genome sequencing data, objectively selecting a k-mer size, and performing classification for phenotype prediction. The analysis also highlights the importance of increasing the k-mer size to produce more biologically explainable results, which brings to the fore the interplay that exists amongst accuracy, computing resources and explainability of classification results. Overall, the analysis provides a new way to elucidate genetic information from genomic data and to identify phenotype relationships, which are important especially in explaining complex biological mechanisms.
Keywords: AWD-LSTM, bootstrapping, k-mers, next generation sequencing
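A minimal, hypothetical Python sketch of the k-mer pipeline is given below: toy sequences stand in for the MTB genomes, scikit-learn's CountVectorizer acts as the k-mer counter, and logistic regression as the classifier; the k-mer size, the embedded motif and the labels are assumptions, not the study's k = 10 models or data.

```python
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# toy sequences and phenotype labels standing in for MTB whole genome sequences
rng = np.random.default_rng(0)
def random_genome(motif, n=300):
    seq = "".join(rng.choice(list("ACGT"), n))
    return seq + motif  # embed a short phenotype-associated motif

sequences = [random_genome("GATTACAGAT") for _ in range(30)] + \
            [random_genome("CCGGTTAACC") for _ in range(30)]
labels = np.array([1] * 30 + [0] * 30)   # e.g. resistant vs susceptible (assumed)

k = 6  # k-mer size; the study selects this objectively, here it is fixed for brevity
vectorizer = CountVectorizer(analyzer="char", ngram_range=(k, k), lowercase=False)
X = vectorizer.fit_transform(sequences)           # sparse k-mer count matrix

clf = LogisticRegression(max_iter=1000)
scores = cross_val_score(clf, X, labels, cv=5)
print(f"{X.shape[1]} distinct {k}-mers, CV accuracy {scores.mean():.2f}")
```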
Procedia PDF Downloads 167
13453 Looking for a Connection between Oceanic Regions with Trends in Evaporation with Continental Ones with Trends in Precipitation through a Lagrangian Approach
Authors: Raquel Nieto, Marta Vázquez, Anita Drumond, Luis Gimeno
Abstract:
One of the hot spots of climate change is the increase in ocean evaporation. The best estimation of evaporation, the OAFlux data, shows strong increasing trends in evaporation from the oceans since 1978, with peaks during the hemispheric winter, and strongest along the paths of the global western boundary currents and in inner seas. The transport of moisture from oceanic sources to the continents is the connection between evaporation from the ocean and precipitation over the continents. A key question is to relate the evaporative source regions over the oceans where trends have occurred in the last decades with their sinks over the continents, to check whether there have also been trends in the precipitation amount or its characteristics. A Lagrangian approach based on FLEXPART and ERA-Interim data is used to establish this connection. The analyzed period was 1980 to 2012. The results show that there is no general pattern, but significant agreement was found in important areas of climate interest.
Keywords: ocean evaporation, Lagrangian approaches, continental precipitation, Europe
Procedia PDF Downloads 256
13452 Proposal of Design Method in the Semi-Acausal System Model
Authors: Shigeyuki Haruyama, Ken Kaminishi, Junji Kaneko, Tadayuki Kyoutani, Siti Ruhana Omar, Oke Oktavianty
Abstract:
This study proposes a method for defining value and function in the manufacturing sector. In the current discussion on the state of modeling methods, the definition of 1D-CAE has so far remained ambiguous and non-conceptual. Across all physics fields, such methods are defined through the formulation of differential algebraic equations, in which only time derivatives are applied in the simulation. In this paper, we propose a semi-acausal modeling concept together with a differential algebraic equation method as a new modeling approach, whose efficiency has been verified through a comparison of the numerical analysis results between the semi-acausal modeling calculation and the FEM theory calculation.
Keywords: system model, physical models, empirical models, conservation law, differential algebraic equation, object-oriented
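For readers less familiar with the terminology, the differential algebraic equation formulation referred to here couples differential states with algebraic constraints; in the generic semi-explicit textbook form (not the authors' specific model) it reads dx/dt = f(x, z, t) together with 0 = g(x, z, t), where x collects the differential (state) variables, z the algebraic variables, and the constraint g must be satisfied at every instant of the simulation.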
Procedia PDF Downloads 485
13451 Constructivism and Situational Analysis as Background for Researching Complex Phenomena: Example of Inclusion
Authors: Radim Sip, Denisa Denglerova
Abstract:
It is impossible to capture complex phenomena, such as inclusion, with reductionism. The most common form of reductionism is the objectivist approach, where processes and relationships are reduced to entities and clearly outlined phases, with a consequent search for relationships between them. Constructivism as a paradigm and situational analysis as a methodological research portfolio represent a way to avoid the dominant objectivist approach. They work with a situation, i.e. with the essential blending of actors and their environment. Primary transactions take place between actors and their surroundings. Researchers create constructs based on their need to solve a problem. Concepts therefore do not describe reality, but rather a complex of real needs in relation to the available options for how such needs can be met. For the examination of a complex problem, corresponding methodological tools and an overall design of the research are necessary. Using original research on inclusion in the Czech Republic as an example, this contribution demonstrates that inclusion is not a substance easily described, but rather a relationship field changing its forms in response to its actors’ behaviour and current circumstances. Inclusion consists of a dynamic relationship between an ideal, real circumstances and the ways to achieve that ideal under the given circumstances. Such achievement has many shapes and thus cannot be captured by a description of objects. It can be expressed in relationships in the situation defined by time and space. Situational analysis offers tools to examine such phenomena. It understands a situation as a complex of dynamically changing aspects and prefers relationships and positions in the given situation over a clear and final definition of actors, entities, etc. Situational analysis assumes the creation of constructs as a tool for solving the problem at hand. It emphasizes the meanings that arise in the process of coordinating human actions, and the discourses through which these meanings are negotiated. Finally, it offers “cartographic tools” (situational maps, social worlds/arenas maps, positional maps) that are able to capture the complexity in other than linear-analytical ways. This approach allows inclusion to be described as a complex of phenomena taking place with a certain historical preference, a complex that can be overlooked if analyzed with a more traditional approach.
Keywords: constructivism, situational analysis, objective realism, reductionism, inclusion
Procedia PDF Downloads 149
13450 Research on the Conservation Strategy of Territorial Landscape Based on Characteristics: The Case of Fujian, China
Authors: Tingting Huang, Sha Li, Geoffrey Griffiths, Martin Lukac, Jianning Zhu
Abstract:
Territorial landscapes have experienced a gradual loss of their typical characteristics during long-term human activities. In order to protect the integrity of regional landscapes, it is necessary to characterize, evaluate and protect them in a graded manner. The study takes Fujian, China, as an example and classifies the landscape characters of the site at the regional, middle and detailed scales. A multi-scale approach combining parametric and holistic approaches is used to classify and partition the landscape character types (LCTs) and landscape character areas (LCAs) at different scales, and a multi-element landscape assessment approach is adopted to explore conservation strategies for the landscape character. Firstly, multiple fields and multiple elements of geography, nature and the humanities were selected as the basis of assessment according to the scales. Secondly, the study takes a parametric approach to the classification and partitioning of landscape character, using principal component analysis and two-stage cluster analysis (K-means and GMM) in MATLAB to obtain the LCTs, combined with the Canny edge detection algorithm to obtain the landscape character contours; the LCTs and LCAs are then corrected by field survey and manual identification. Finally, the study adopts the landscape sensitivity assessment method to perform the landscape character conservation analysis and formulates five strategies for the different LCAs: conservation, enhancement, restoration, creation, and combination. This multi-scale identification approach can efficiently integrate multiple types of landscape character elements, reduce the difficulty of broad-scale operations in the process of landscape character conservation, and provide a basis for landscape character conservation strategies. Based on the natural background and the restoration of regional characteristics, the results of the landscape character assessment are scientific and objective and can provide a strong reference for regional and national scale territorial spatial planning.
Keywords: parameterization, multi-scale, landscape character identification, landscape character assessment
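The classification step can be illustrated with a short sketch (the original work used MATLAB; here Python with scikit-learn stands in, and the synthetic landscape variables, component counts and cluster numbers are assumptions): principal component analysis reduces the multi-element description of each cell, K-means provides an initial partition, and a Gaussian mixture model refines it into landscape character types. The Canny contour extraction and the field correction are omitted.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans
from sklearn.mixture import GaussianMixture

# synthetic grid cells described by assumed landscape variables
# (e.g. elevation, slope, land cover, settlement density, distance to coast, ...)
rng = np.random.default_rng(0)
cells = rng.standard_normal((2000, 8))

# 1) standardise and reduce the multi-element description
scores = PCA(n_components=4).fit_transform(StandardScaler().fit_transform(cells))

# 2) two-stage clustering: K-means initialisation, then a Gaussian mixture refinement
kmeans = KMeans(n_clusters=6, n_init=10, random_state=0).fit(scores)
gmm = GaussianMixture(n_components=6, means_init=kmeans.cluster_centers_,
                      random_state=0).fit(scores)
landscape_character_types = gmm.predict(scores)   # one LCT label per cell
print(np.bincount(landscape_character_types))
```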
Procedia PDF Downloads 99
13449 Virtualization of Biomass Colonization: Potential of Application in Precision Medicine
Authors: Maria Valeria De Bonis, Gianpaolo Ruocco
Abstract:
Nowadays, computational modeling is paving new design and verification ways in a number of industrial sectors. The technology is ripe to tackle some cases in the bioengineering and medicine frameworks: for example, looking at the strategic and ethical importance of oncology research, efforts should be made to yield new and powerful resources for tumor knowledge and understanding. With these driving motivations, we approach this gigantic problem by using some standard engineering tools, such as the mathematics behind biomass transfer. We present here some bacterial colonization studies in complex structures. As strong analogies hold with some forms of tumor proliferation, we extend our study to a benchmark case of a solid tumor. By means of commercial software, we model biomass and energy evolution in arbitrary media. The approach will be useful for casting virtualization cases of cancer growth in human organs, while augmented reality tools will be used to provide realistic aid to informed decisions in treatment and surgery.
Keywords: bacteria, simulation, tumor, precision medicine
Procedia PDF Downloads 335
13448 Learning Dynamic Representations of Nodes in Temporally Variant Graphs
Authors: Sandra Mitrovic, Gaurav Singh
Abstract:
In many industries, including telecommunications, churn prediction has been a topic of active research. A lot of attention has been drawn to devising the most informative features, and this area of research has gained even more focus with the spread of (social) network analytics. Call detail records (CDRs) have been used to construct customer networks and extract potentially useful features. However, to the best of our knowledge, no studies including network features have yet proposed a generic way of representing network information. Instead, ad-hoc and dataset-dependent solutions have been suggested. In this work, we build upon a recently presented method (node2vec) to obtain representations for nodes in the observed network. The proposed approach is generic and applicable to any network and domain. Unlike node2vec, which assumes a static network, we consider a dynamic and time-evolving network. To account for this, we propose an approach that constructs the feature representation of each node by generating its node2vec representations at different timestamps, concatenating them and finally compressing them using an auto-encoder-like method in order to retain reasonably long and informative feature vectors. We test the proposed method on the churn prediction task in the telco domain. To predict churners at timestamp ts+1, we construct training and testing datasets consisting of feature vectors from the time intervals [t1, ts-1] and [t2, ts] respectively, and use traditional supervised classification models like SVM and logistic regression. The observed results show the effectiveness of the proposed approach as compared to ad-hoc, feature-selection-based approaches and static node2vec.
Keywords: churn prediction, dynamic networks, node2vec, auto-encoders
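A hedged Python sketch of the idea follows: per-timestamp node embeddings are concatenated and then compressed into one feature vector per node. A placeholder embedding function and PCA stand in for node2vec and for the auto-encoder-like compression described above, so the function names, dimensions and graph snapshots are all assumptions.

```python
import numpy as np
import networkx as nx
from sklearn.decomposition import PCA

def embed_snapshot(graph, nodes, dim=16, seed=0):
    """Placeholder for node2vec on one snapshot: here just normalised rows of a
    random projection of the adjacency matrix (illustrative only)."""
    A = nx.to_numpy_array(graph, nodelist=nodes)
    rng = np.random.default_rng(seed)
    proj = rng.standard_normal((A.shape[1], dim)) / np.sqrt(dim)
    emb = A @ proj
    return emb / (np.linalg.norm(emb, axis=1, keepdims=True) + 1e-9)

# three snapshots of a call graph over the same node set (synthetic)
nodes = list(range(100))
snapshots = [nx.gnp_random_graph(100, 0.05, seed=s) for s in (1, 2, 3)]

# concatenate per-timestamp embeddings for every node ...
concat = np.hstack([embed_snapshot(g, nodes, seed=s) for s, g in enumerate(snapshots)])

# ... and compress (the paper uses an auto-encoder-like method; PCA is a stand-in)
features = PCA(n_components=24).fit_transform(concat)
print(concat.shape, "->", features.shape)   # (100, 48) -> (100, 24)

# `features` would then feed a churn classifier such as logistic regression or SVM
```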
Procedia PDF Downloads 315
13447 Production of New Hadron States in Effective Field Theory
Authors: Qi Wu, Dian-Yong Chen, Feng-Kun Guo, Gang Li
Abstract:
In the past decade, a growing number of new hadron states have been observed, which are dubbed XYZ states, in the heavy quarkonium mass regions. In this work, we present our study on the production of some of these new hadron states. In particular, we investigate the processes Υ(5S,6S)→ Zb(10610)/Zb(10650)π, Bc→ Zc(3900)/Zc(4020)π and Λb→ Pc(4312)/Pc(4440)/Pc(4457)K. (1) For the production of Zb(10610)/Zb(10650) from Υ(5S,6S) decay, two types of bottom-meson loops were discussed within a nonrelativistic effective field theory. We found that the loop contributions with all intermediate states being the S-wave ground state bottom mesons are negligible, while the loops with one bottom meson being the broad B₀* or B₁' resonance could provide the dominant contributions to Υ(5S)→ Zb⁽'⁾π. (2) For the production of Zc(3900)/Zc(4020) from Bc decay, the branching ratios of Bc⁺→ Zc(3900)⁺π⁰ and Bc⁺→ Zc(4020)⁺π⁰ are estimated to be of the order of 10⁻⁴ and 10⁻⁷ in an effective Lagrangian approach. The large production rate of Zc(3900) could provide an important source of the production of Zc(3900) from the semi-exclusive decay of b-flavored hadrons reported by the D0 Collaboration, which can be tested by exclusive measurements at LHCb. (3) For the production of Pc(4312), Pc(4440) and Pc(4457) from Λb decay, the ratios of the branching fractions of Λb→ Pc K were predicted in a molecular scenario by using an effective Lagrangian approach, and are weakly dependent on our model parameter. We also find that the ratios of the products of the branching fractions of Λb→ Pc K and Pc→ J/ψ p can be well interpreted in the molecular scenario. Moreover, the estimated branching fractions of Λb→ Pc K are of order 10⁻⁶, which could be tested by further measurements by the LHCb Collaboration.
Keywords: effective Lagrangian approach, hadron loops, molecular states, new hadron states
Procedia PDF Downloads 132
13446 The Case for Strategic Participation: How Facilitated Engagement Can Be Shown to Reduce Resistance and Improve Outcomes Through the Use of Strategic Models
Authors: Tony Mann
Abstract:
This paper sets out the case for involving and engaging employees/workers/stakeholders/staff in any significant change that is being considered by the senior executives of the organization. It establishes the rationale, the approach, the methodology of engagement and the benefits of a participative approach. It challenges the new norm of imposing change for fear of resistance and instead suggests that involving people produces better outcomes and a longer-lasting impact. Various strategic models are introduced and illustrated to explain how the process can be most effective. The paper highlights one model in particular, the Process Iceberg® Organizational Change model, that has proven to be instrumental in developing effective change. Its use is demonstrated in its various forms, explaining why so much change fails to address the key elements and how change can be managed more productively. ‘Participation’ in change is too often seen as negative, expensive and unwieldy. The paper aims to show that another model, UIA = O + E, can offset the difficulties and, in fact, produce much more positive and effective change.
Keywords: facilitation, stakeholders, buy-in, digital workshops
Procedia PDF Downloads 110
13445 Optimum Stratification of a Skewed Population
Authors: D. K. Rao, M. G. M. Khan, K. G. Reddy
Abstract:
The focus of this paper is to develop a technique for solving the combined problem of determining the Optimum Strata Boundaries (OSB) and the Optimum Sample Size (OSS) of each stratum, when the population under study is skewed and the study variable has a Pareto frequency distribution. The problem of determining the OSB is formulated as a Mathematical Programming Problem (MPP), which is then solved by the dynamic programming technique. A numerical example is presented to illustrate the computational details of the proposed method. The proposed technique is useful for obtaining the OSB and OSS for a Pareto-type skewed population, which minimizes the variance of the estimate of the population mean.
Keywords: stratified sampling, optimum strata boundaries, optimum sample size, Pareto distribution, mathematical programming problem, dynamic programming technique
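To make the idea concrete, the Python sketch below (an illustrative simplification, not the paper's MPP formulation) chooses strata boundaries for a sample drawn from a Pareto distribution by dynamic programming. Under Neyman allocation, the leading term of the variance of the stratified mean is proportional to the square of the sum of W_h*S_h over strata, so the recursion minimizes that sum; the shape parameter, sample size, number of strata and boundary grid are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
data = np.sort((rng.pareto(a=2.5, size=3000) + 1.0) * 10.0)   # skewed study variable
N, L = data.size, 4                                           # population size, strata (assumed)
step = 30                                                     # candidate boundary every 30 sorted units
idx = list(range(0, N + 1, step))                             # candidate cut indices into sorted data
if idx[-1] != N:
    idx.append(N)

s1 = np.concatenate(([0.0], np.cumsum(data)))
s2 = np.concatenate(([0.0], np.cumsum(data ** 2)))

def cost(a, b):
    """W_h * S_h for the stratum data[a:b] of the sorted study variable."""
    n = b - a
    if n < 2:
        return np.inf
    mean = (s1[b] - s1[a]) / n
    var = max((s2[b] - s2[a]) / n - mean ** 2, 0.0)
    return (n / N) * np.sqrt(var * n / (n - 1))               # W_h times sample S_h

m = len(idx)
dp = np.full((L + 1, m), np.inf)
back = np.zeros((L + 1, m), dtype=int)
dp[0][0] = 0.0
for h in range(1, L + 1):                                     # number of strata used so far
    for j in range(1, m):                                     # right end of the h-th stratum
        for i in range(j):                                    # left end of the h-th stratum
            c = dp[h - 1][i] + cost(idx[i], idx[j])
            if c < dp[h][j]:
                dp[h][j], back[h][j] = c, i

# trace back the optimum strata boundaries (OSB) as values of the study variable
j, cut_points = m - 1, []
for h in range(L, 1, -1):
    j = back[h][j]
    cut_points.append(data[idx[j]])
print("OSB:", [round(v, 1) for v in sorted(cut_points)])
print("minimised sum of W_h*S_h:", round(dp[L][m - 1], 4))
```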
Procedia PDF Downloads 455
13444 In-service High School Teachers’ Experiences On Blended Teaching Approach Of Mathematics
Authors: Lukholo Raxangana
Abstract:
Fourth Industrial Revolution (4IR)-era teaching offers in-service mathematics teachers opportunities to use blended approaches to engage learners while teaching mathematics. This study explores in-service high school teachers' experiences with a blended teaching approach to mathematics. This qualitative case study involved eight teachers from four selected schools in the Sedibeng West District of the Gauteng Province. The study used the community of inquiry model as its analytical framework for data analysis. Data were collected through semi-structured interviews and focus-group discussions to explore the in-service teachers' experiences with the influence of blended teaching (BT) on learning mathematics. The study's findings concern the impact of load-shedding, the benefits of BT, in-service teachers' perceptions of BT, and hindrances to BT. Based on these findings, the study recommends that further research focus on developing data-free BT tools to assist during load-shedding, regardless of location.
Keywords: blended teaching, teachers, in-service, mathematics
Procedia PDF Downloads 58
13443 Influence Analysis of Pelamis Wave Energy Converter Structure Parameters
Authors: Liu Shengnan, Sun Liping, Zhu Jianxun
Abstract:
Based on three-dimensional potential flow theory and the motion equations of hinged rigid bodies, the structural RAOs of the Pelamis wave energy converter are analyzed. Numerical simulations are carried out on the Pelamis under irregular wave conditions, and the motion response of the structures and the total generated power are obtained. The paper analyzes the factors influencing the average power, including the diameter of the floating body, the section form of the floating body, the draft, the hinge stiffness and the damping. The optimum parameters are obtained for the wave conditions of Zhejiang Province. Compared with the results of the Pelamis experiment made by Glasgow University, the method applied in this paper is shown to be feasible.
Keywords: Pelamis, hinge, floating multibody, wave energy
Procedia PDF Downloads 465
13442 Usage of Military Spending, Debt Servicing and Growth for Dealing with Emergency Plan of Indian External Debt
Authors: Sahbi Farhani
Abstract:
This study investigates the relationship between external debt and military spending in the case of India over the period 1970–2012. In doing so, we applied structural break unit root tests to examine the stationarity properties of the variables. The Auto-Regressive Distributed Lag (ARDL) bounds testing approach is used to test whether cointegration exists in the presence of structural breaks in the series. Our results indicate cointegration among external debt, military spending, debt servicing, and economic growth. Moreover, military spending and debt servicing add to external debt, while economic growth helps in lowering external debt. The Vector Error Correction Model (VECM) analysis and the Granger causality test reveal that military spending and economic growth cause external debt. A feedback effect also exists between external debt and debt servicing in the case of India.
Keywords: external debt, military spending, ARDL approach, India
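The time-series toolchain can be illustrated with a short Python sketch using statsmodels on synthetic series (the Indian data, the structural-break unit root tests and the ARDL bounds/VECM estimation themselves are not reproduced here): an ADF unit root test on each variable, followed by a pairwise Granger causality test on differenced series.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import adfuller, grangercausalitytests

# synthetic annual series standing in for the 1970-2012 Indian data (log levels)
rng = np.random.default_rng(0)
T = 43
growth = np.cumsum(0.05 + 0.01 * rng.standard_normal(T))
military = 0.6 * growth + np.cumsum(0.01 * rng.standard_normal(T))
debt = 0.8 * military - 0.3 * growth + np.cumsum(0.02 * rng.standard_normal(T))
df = pd.DataFrame({"external_debt": debt, "military": military, "growth": growth})

# 1) ADF unit root test on levels (structural-break variants are not in statsmodels)
for col in df:
    stat, pval = adfuller(df[col], autolag="AIC")[:2]
    print(f"ADF {col}: stat={stat:.2f}, p={pval:.3f}")

# 2) does military spending Granger-cause external debt? (first differences)
d = df.diff().dropna()
grangercausalitytests(d[["external_debt", "military"]], maxlag=2)
```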
Procedia PDF Downloads 296
13441 An Investigation into the Current Implementation of Design-Build Contracts in the Kingdom of Saudi Arabia
Authors: Ibrahim A. Alhammad, Suleiman A. Al-Otaibi, Khalid S. Al-Gahtani, Naïf Al-Otaibi, Abdulaziz A. Bubshait
Abstract:
In the last decade, the use of the design-build project delivery system for engineering contracts has been increasing in North America because it reduces project duration and minimizes costs. The shift from the traditional design-bid-build approach to design-build contracts has been attributed to many factors, such as the evolution of the regulatory and legal frameworks governing engineering contracts and improvements in integrating design and construction. The aforementioned practice of contracting is well suited to North America; yet this may not be the case in Saudi Arabia, where the traditional approach to construction contracting remains dominant. The authors believe there are a number of factors related to the gaps in the level of sophistication of the engineering and management of construction projects in the two countries. As a step towards improving Saudi construction practice by adopting this new trend of construction contracting, this paper identifies the reasons why the design-build form of contracting is not frequently utilized. A field survey, including a questionnaire addressing the research problem, was distributed to the three main parties to construction contracts: clients, consultants, and contractors. The collected data were statistically sufficient to identify the reasons for not adopting the design-build approach in Saudi Arabia. The reasons are: (1) the lack of a regulatory and legal framework; (2) the absence of clear owner criteria for the trade-off between competing contractors; and (3) a lack of experience, knowledge and skill.
Keywords: design-build projects, Saudi Arabia, GCC, mega projects
Procedia PDF Downloads 220