Search results for: intuitionistic fuzzy entropy measure
2883 Inclusive Cities Decision Matrix Based on a Multidimensional Approach for Sustainable Smart Cities
Authors: Madhurima S. Waghmare, Shaleen Singhal
Abstract:
The concepts of smartness, inclusion, and sustainability are multidisciplinary and fuzzy, rooted in economic and social development theories and policies that are reflected in the spatial development of cities. It is a challenge to convert these concepts from aspirations into transformative actions. There is a dearth of assessment and planning tools to support city planners and administrators in developing smart, inclusive, and sustainable cities. To address this gap, this study develops an inclusive cities decision matrix based on an exploratory approach and using mixed methods. The matrix is grounded in a review of multidisciplinary urban sector literature and refined and finalized based on inputs from experts and insights from case studies. The application of the decision matrix to the case study cities in India suggests that contemporary planning tools for cities need to be multidisciplinary and flexible to respond to the unique needs of diverse contexts. The paper suggests that a multidimensional and inclusive approach to city planning can play an important role in building sustainable smart cities.
Keywords: inclusive-cities decision matrix, smart cities in India, city planning tools, sustainable cities
Procedia PDF Downloads 156
2882 Hedgerow Detection and Characterization Using Very High Spatial Resolution SAR Data
Authors: Saeid Gharechelou, Stuart Green, Fiona Cawkwell
Abstract:
Hedgerows play an important role in a wide range of ecological habitats, landscape and agricultural management, carbon sequestration, and wood production. Detecting hedgerows accurately from satellite imagery is a challenging remote sensing problem: spatially, a hedge is very similar to a linear object such as a road, and spectrally it is very similar to forest. Remote sensors with very high spatial resolution (VHR) have recently enabled the automatic detection of hedges by acquiring images with sufficient spectral and spatial resolution. Indeed, VHR remote sensing data now provide the opportunity to detect hedgerows as linear features, but characterizing and monitoring them at the landscape scale remains difficult. This research uses TerraSAR-X Spotlight and Staring mode data with 3-5 m resolution, acquired in the wet and dry seasons of 2014-2015 over the Fermoy test site, Ireland, to detect hedgerows. Both channels of the dual-polarization (HH/VV) Spotlight data are used for hedgerow detection. A variety of SAR image techniques are explored in a trial-and-error manner, integrating classification algorithms such as texture analysis, support vector machines, k-means, and random forests to detect hedgerows and characterize them. Shannon entropy (ShE) and backscattering analysis of single- and double-bounce scattering from the polarimetric analysis are applied in an object-oriented classification to extract the hedgerow network. The work is still in progress, and further methods need to be applied to identify the best approach for the study area. The preliminary results presented here indicate that polarimetric TSX imagery can potentially detect hedgerows.
Keywords: TerraSAR-X, hedgerow detection, high resolution SAR image, dual polarization, polarimetric analysis
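For reference, the Shannon entropy named above is, in its generic form, computed from a discrete probability distribution; in polarimetric SAR analysis the probabilities are typically the normalized eigenvalues of the coherency or covariance matrix (the abstract does not state the authors' exact formulation, so this is only the standard definition):

H = -\sum_{i} p_i \log p_i, \qquad p_i = \frac{\lambda_i}{\sum_{j} \lambda_j}

Values of H near zero indicate a single dominant scattering mechanism, while values near the maximum indicate depolarized, random scattering.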
Procedia PDF Downloads 230
2881 Performance Analysis of Permanent Magnet Synchronous Motor Using Direct Torque Control Based ANFIS Controller for Electric Vehicle
Authors: Marulasiddappa H. B., Pushparajesh Viswanathan
Abstract:
The use of internal combustion engines (ICEs) is declining day by day because of pollution and limited fuel availability. In the present scenario, the electric vehicle (EV) plays a major role in place of the ICE vehicle. The performance of EVs can be improved by the proper selection of electric motors. Initially, EVs preferred induction motors for traction purposes, but due to the complexity of controlling the induction motor, the permanent magnet synchronous motor (PMSM) is replacing it in EVs owing to its advantages. Direct torque control (DTC) is one of the well-known techniques for PMSM drives in EVs to control torque and speed. However, the presence of torque ripple is the main drawback of this technique. Many control strategies have been followed to reduce torque ripple in PMSMs. In this paper, an adaptive neuro-fuzzy inference system (ANFIS) controller is proposed to reduce torque ripple and settling time. Performance parameters such as torque, speed, and settling time are compared between a conventional proportional-integral (PI) controller and the ANFIS controller.
Keywords: direct torque control, electric vehicle, torque ripple, PMSM
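For context (not stated in the abstract), the torque that DTC regulates in a PMSM is usually written in the rotor dq frame as:

T_e = \frac{3}{2}\, p \left[ \psi_f i_q + (L_d - L_q)\, i_d i_q \right]

where p is the number of pole pairs, \psi_f the permanent-magnet flux linkage, L_d and L_q the dq-axis inductances, and i_d, i_q the stator current components; DTC acts on stator flux and torque estimates derived from this model, and the ripple targeted by the ANFIS controller appears in T_e.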
Procedia PDF Downloads 164
2880 Optics Meets Microfluidics for Highly Sensitive Force Sensing
Authors: Iliya Dimitrov Stoev, Benjamin Seelbinder, Elena Erben, Nicola Maghelli, Moritz Kreysing
Abstract:
Despite the revolutionizing impact of optical tweezers in materials science and cell biology to date, trapping has so far relied extensively on specific material properties of the probe, and local heating has limited applications related to investigating dynamic processes within living systems. To overcome these limitations while maintaining high sensitivity, here we present a new optofluidic approach that can be used to gently trap microscopic particles and measure femtonewton forces in a contact-free manner and with thermally limited precision.
Keywords: optofluidics, force measurements, microrheology, FLUCS, thermoviscous flows
Procedia PDF Downloads 170
2879 Reliability-Based Life-Cycle Cost Model for Engineering Systems
Authors: Reza Lotfalian, Sudarshan Martins, Peter Radziszewski
Abstract:
The effect of reliability on life-cycle cost, including the initial and maintenance cost of a system, is studied. The failure probability of a component is used to calculate the average maintenance cost during the operation cycle of the component. The standard deviation of the life-cycle cost is also calculated as an error measure for the average life-cycle cost. As a numerical example, the model is used to study the average life-cycle cost of an electric motor.
Keywords: initial cost, life-cycle cost, maintenance cost, reliability
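A minimal sketch of this kind of calculation is given below. It is purely illustrative: the cost figures, failure probability, and Monte Carlo approach are assumptions made for the example, not values or methods taken from the paper.

```python
import numpy as np

# Hypothetical reliability-based life-cycle cost (LCC) estimate:
# LCC = initial cost + repair cost for each failure during the operation cycle.
rng = np.random.default_rng(0)

c_initial = 1_000.0   # initial (purchase/installation) cost, assumed
c_repair = 250.0      # cost per corrective maintenance action, assumed
p_fail = 0.02         # failure probability per operating period, assumed
n_periods = 120       # length of the operation cycle, assumed
n_sim = 100_000       # Monte Carlo samples

failures = rng.binomial(n_periods, p_fail, size=n_sim)  # failures per life cycle
lcc = c_initial + failures * c_repair                   # sampled life-cycle costs

print(f"average LCC: {lcc.mean():.1f}")
print(f"standard deviation of LCC (error measure): {lcc.std(ddof=1):.1f}")
```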
Procedia PDF Downloads 605
2878 Bias in the Estimation of Covariance Matrices and Optimality Criteria
Authors: Juan M. Rodriguez-Diaz
Abstract:
The precision of parameter estimators in the Gaussian linear model is traditionally accounted for by the variance-covariance matrix of the asymptotic distribution. However, this measure can underestimate the true variance, especially for small samples. Traditionally, optimal design theory pays attention to this variance through its relationship with the model's information matrix. For this reason it seems convenient, at least in some cases, to adapt the optimality criteria in order to obtain the best designs for the actual variance structure; otherwise, the loss in efficiency of the designs obtained with the traditional approach may be considerable.
Keywords: correlated observations, information matrix, optimality criteria, variance-covariance matrix
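The relationship referred to here is the standard one, included for reference (the abstract does not spell it out): for the Gaussian linear model y = X\beta + \varepsilon with \varepsilon \sim N(0, \Sigma), the generalized least-squares estimator satisfies

\operatorname{Cov}(\hat\beta) = \left( X^{\top} \Sigma^{-1} X \right)^{-1} = M^{-1}

where M = X^{\top}\Sigma^{-1}X is the information matrix. Classical criteria such as D-optimality maximize \det(M), which is why any bias or underestimation in the assumed covariance structure \Sigma propagates directly into the optimality criterion.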
Procedia PDF Downloads 443
2877 Democracy as a Curve: A Study on How Democratization Impacts Economic Growth
Authors: Henrique Alpalhão
Abstract:
This paper attempts to model the widely studied relationship between a country's economic growth and its level of democracy, with an emphasis on possible non-linearities. We adopt the concept of 'political capital' as a measure of democracy, which is extremely uncommon in the literature and brings considerable advantages both in terms of dynamic considerations and plausibility. While the literature is not consensual on this matter, we obtain, via panel Arellano-Bond regression analysis on a database of more than 60 countries over 50 years, significant and robust results that indicate that the impact of democratization on economic growth varies according to the stage of democratic development each country is in.
Keywords: democracy, economic growth, political capital, political economy
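A generic dynamic-panel specification of the kind estimated with the Arellano-Bond GMM estimator is shown below for orientation; the quadratic term is one simple way to capture a non-linear effect of the democracy measure D_{it} (here political capital), and it is not necessarily the authors' exact specification:

g_{it} = \alpha\, g_{i,t-1} + \beta_1 D_{it} + \beta_2 D_{it}^2 + \mu_i + \varepsilon_{it}

where g_{it} is the growth of country i in period t, \mu_i a country fixed effect, and lagged levels of the regressors serve as instruments for the first-differenced equation.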
Procedia PDF Downloads 321
2876 Methodologies for Management of Sustainable Tourism: A Case Study in Jalapão/TO/Brazil
Authors: Mary L. G. S. Senna, Veruska C. Dutra, Afonso R. Aquino
Abstract:
The study applies and analyses two tourism management tools that can support public managers' decision-making: the Barometer of Tourism Sustainability (BTS) and the Ecological Footprint (EF). The results have shown that the BTS provides an integrated view of the tourism system, highlighting the need to plan appropriate actions so that the destination can reach the proposed positive scale (potentially sustainable). The ecological footprint of tourism, in turn, is an important tool for measuring the potential impacts generated by tourism on the local tourist reality.
Keywords: barometer of tourism sustainability, ecological footprint of tourism, Jalapão/Brazil, sustainable tourism
Procedia PDF Downloads 503
2875 The Effect of Santolina Plant Extract on Nitro-Oxidative Stress
Authors: Sabrina Sebbane, Alina Elena Parvu
Abstract:
Introduction: Santolina rosmarinifolia is a plant of the Santolina genus, a group of widely used medicinal plants. Some Santolina species have been proven to have potent anti-inflammatory and anti-oxidant effects. However, no in vivo study has demonstrated this for Santolina rosmarinifolia. The aim of our study is to experimentally evaluate the potential anti-inflammatory and anti-oxidant effects of Santolina rosmarinifolia plant extracts on acute inflammation in rats. These effects are assessed by measuring changes in nitric oxide, reactive oxygen species, and the antioxidant response in serum. Materials and Methods: Rats were divided into 5 groups (n=6). Three groups were given Santolina rosmarinifolia extract by gavage at different concentrations (100%, 50%, 25%) for a week. Inflammation was induced by i.m. injection of turpentine oil on the 8th day. One group was given only turpentine oil, and the fifth group acted as control and was given only saline solution. Blood was collected and serum separated. Global tests were used to measure the oxidative stress, total oxidative status (TOS) and total antioxidant reactivity (TAR), and the modified Griess assay was used to measure NO synthesis. Malondialdehyde (MDA) and thiol levels were also assessed. Results: Santolina rosmarinifolia did not significantly change the TOS levels (p > 0.05). Santolina rosmarinifolia 25% and 50% decreased the TAR levels significantly (p < 0.001). Santolina 100% did not have a significant effect on TAR (p > 0.05). All concentrations of Santolina rosmarinifolia increased the oxidative stress index (OSI) significantly (p < 0.05). Santolina rosmarinifolia 100% significantly decreased NO synthesis (p < 0.05). In the diluted Santolina groups, no significant effect on NO synthesis was observed. In the groups treated with Santolina rosmarinifolia 100% and 50%, thiol concentrations were significantly higher compared to the inflammation group (p < 0.02). A higher stimulatory effect was found in the Santolina 25% group (p < 0.05). MDA levels were not significantly modified by the administration of Santolina rosmarinifolia (p > 0.05). Conclusion: All three solutions of Santolina rosmarinifolia had no important effect on oxidant production. However, the Santolina rosmarinifolia solutions had a positive effect by increasing the thiol concentration in the serum of the models. The sum of all the effects produced by the administration of Santolina did not show a significant decrease of nitro-oxidative stress. Further experiments including smaller concentrations of Santolina rosmarinifolia will be made. Santolina rosmarinifolia should also be tested as a curative treatment.
Keywords: inflammation, MDA, nitric oxide, Santolina rosmarinifolia, thiols, TAR, TOS
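The oxidative stress index reported above is commonly computed as the ratio of the total oxidant status to the total antioxidant capacity; the exact unit scaling used by the authors is not stated, so the formula below is only the conventional definition:

OSI = \frac{TOS}{TAR}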
Procedia PDF Downloads 260
2874 An Evaluative Approach for Successful Implementation of Lean and Green Manufacturing in Indian SMEs
Authors: Satya S. N. Narayana, P. Parthiban, T. Niranjan, N. Kannan
Abstract:
Enterprises adopt methodologies to increase their business performance and to stay competitive in the volatile global market. Lean manufacturing is one such manufacturing paradigm, which focuses on cost reduction through the elimination of wastes or non-value-added activities. With increased awareness of social responsibility and the need to meet environmental policy requirements, green manufacturing is becoming increasingly important for industries. Large plants, which have more resources, have started implementing lean and green practices and are getting good results. Small and medium scale enterprises (SMEs) are facing problems in implementing the lean and green concept. This paper aims to identify the key issues for the implementation of the lean and green concept in Indian SMEs. The key factors, identified based on a literature review and expert opinions, are grouped into different levels by Modified Interpretive Structural Modeling (MISM) to explore their relative importance for implementing lean and green manufacturing. Finally, the Fuzzy Analytic Network Process (FANP) method has been used to determine the extent to which the main principles of lean and green manufacturing have been carried out in six Indian medium-scale manufacturing industries.
Keywords: lean manufacturing, green manufacturing, MISM, FANP
Procedia PDF Downloads 542
2873 Assessing the Effect of Waste-Based Geopolymer on Asphalt Binders
Authors: Amani A. Saleh, Maram M. Saudy, Mohamed N. AbouZeid
Abstract:
Asphalt cement concrete is a very commonly used material in the construction of roads. It has many advantages, such as being easy to use as well as providing high user satisfaction in terms of comfort and safety on the road. However, there are some problems that come with asphalt cement concrete, such as its high carbon footprint, which makes it environmentally unfriendly. In addition, pavements require frequent maintenance, which can be very costly and uneconomic. The aim of this research is to study the effect of mixing waste-based geopolymers with asphalt binders. Geopolymer mixes were prepared by combining alumino-silicate sources such as fly ash, silica fume, and metakaolin with alkali activators. The purpose of mixing geopolymers with the asphalt binder is to enhance the rheological and microstructural properties of the asphalt. This was done in two phases: the first phase developed an optimum mix design for the geopolymer additive itself, and the following phase tested the geopolymer-modified asphalt binder after the addition of the optimum geopolymer mix design. The testing of the modified binder is performed according to the Superpave testing procedures, which include the dynamic shear rheometer to measure parameters such as rutting and fatigue cracking, and the rotational viscometer to measure workability. In addition, the microstructural properties of the modified binder are studied using environmental scanning electron microscopy (ESEM). In the testing phase, the aim is to observe whether the addition of different geopolymer percentages to the asphalt binder will enhance the properties of the binder and yield desirable results. Furthermore, the tests on the geopolymer-modified binder were carried out at fixed time intervals; therefore, curing time was the main parameter being tested in this research. It was observed that the addition of geopolymers to asphalt binder has shown increased performance of the asphalt binder with time. It is worth mentioning that carbon emissions are expected to be reduced, since geopolymers are environmentally friendly materials that minimize carbon emissions and lead to a more sustainable environment. Additionally, the use of industrial by-products such as fly ash and silica fume is beneficial in the sense that they are recycled into producing geopolymers instead of being accumulated in landfills and therefore wasting space.
Keywords: geopolymer, rutting, superpave, fatigue cracking, sustainability, waste
Procedia PDF Downloads 128
2872 Optical Coherence Tomography in Parkinson’s Disease: A Potential in-vivo Retinal α-Synuclein Biomarker in Parkinson’s Disease
Authors: Jessica Chorostecki, Aashka Shah, Fen Bao, Ginny Bao, Edwin George, Navid Seraji-Bozorgzad, Veronica Gorden, Christina Caon, Elliot Frohman
Abstract:
Background: Parkinson’s Disease (PD) is a neurodegenerative disorder associated with the loss of dopaminergic cells and the presence of α-synuclein (AS) aggregation in Lewy bodies. Both dopaminergic cells and AS are found in the retina. Optical coherence tomography (OCT) allows high-resolution in-vivo examination of retinal structural injury in neurodegenerative disorders, including PD. Methods: We performed a cross-sectional OCT study in patients with definite PD and healthy controls (HC) using a spectral-domain (SD-OCT) platform to measure the peripapillary retinal nerve fiber layer (pRNFL) thickness and total macular volume (TMV). We performed intra-retinal segmentation with fully automated segmentation software to measure the volume of the RNFL, ganglion cell layer (GCL), inner plexiform layer (IPL), inner nuclear layer (INL), outer plexiform layer (OPL), and the outer nuclear layer (ONL). Segmentation was performed blinded to the clinical status of the study participants. Results: 101 eyes from 52 PD patients (mean age 65.8 years) and 46 eyes from 24 HC subjects (mean age 64.1 years) were included in the study. The mean pRNFL thickness was not significantly different (96.95 μm vs 94.42 μm, p=0.07), but the TMV was significantly lower in PD compared to HC (8.33 mm3 vs 8.58 mm3, p=0.0002). Intra-retinal segmentation showed no significant difference in the RNFL volume between the PD and HC groups (0.95 mm3 vs 0.92 mm3, p=0.454). However, GCL, IPL, INL, and ONL volumes were significantly reduced in PD compared to HC. In contrast, the volume of the OPL was significantly increased in PD compared to HC. Conclusions: Our finding of the enlarged OPL corresponds with mRNA expression studies showing localization of AS in the OPL across vertebrate species and autopsy studies demonstrating AS aggregation in the deeper layers of the retina in PD. We propose that the enlargement of the OPL may represent a potential biomarker of AS aggregation in PD. Longitudinal studies in larger cohorts are warranted to confirm our observations, which may have significant implications in disease monitoring and therapeutic development.
Keywords: optical coherence tomography, biomarker, Parkinson's disease, alpha-synuclein, retina
Procedia PDF Downloads 437
2871 Fuzzy Data, Random Drift, and a Theoretical Model for the Sequential Emergence of Religious Capacity in Genus Homo
Authors: Margaret Boone Rappaport, Christopher J. Corbally
Abstract:
The ancient ape ancestral population from which living great ape and human species evolved had demographic features affecting their evolution. The population was large, had great genetic variability, and natural selection was effective at honing adaptations. The emerging populations of chimpanzees and humans were affected more by founder effects and genetic drift because they were smaller. Natural selection did not disappear, but it was not as strong. Consequences of the 'population crash' and the human effective population size are introduced briefly. The history of the ancient apes is written in the genomes of living humans and great apes. The expansion of the brain began before the human line emerged. Coalescence times for some genes are very old – up to several million years, long before Homo sapiens. The mismatch between gene trees and species trees highlights the anthropoid speciation processes and gives the human genome history a fuzzy, probabilistic quality. However, it suggests traits that might form a foundation for capacities emerging later. A theoretical model is presented in which the genomes of early ape populations provide the substructure for the emergence of religious capacity later on the human line. The model does not search for religion, but for its foundations. It suggests a course by which an evolutionary line that began with prosimians eventually produced a human species with biologically based religious capacity. The model of the sequential emergence of religious capacity relies on cognitive science, neuroscience, paleoneurology, primate field studies, cognitive archaeology, genomics, and population genetics, and it emphasizes five trait types: (1) Documented, positive selection of sensory capabilities on the human line may have favored survival, but also eventually enriched human religious experience. (2) The bonobo model suggests a possible down-regulation of aggression and increase in tolerance while feeding, as well as paedomorphism – but in a human species that remains cognitively sharp (unlike the bonobo). The two species emerged from the same ancient ape population, so it is logical to search for shared traits. (3) An up-regulation of emotional sensitivity and compassion seems to have occurred on the human line. This finds support in modern genetic studies. (4) The authors’ published model of morality's emergence in Homo erectus encompasses a cognitively based, decision-making capacity that was hypothetically overtaken, in part, by religious capacity. Together, they produced a strong, variable, biocultural capability to support human sociability. (5) The full flowering of human religious capacity came with the parietal expansion and smaller face (klinorhynchy) found only in Homo sapiens. Details from paleoneurology suggest the stage was set for human theologies. Larger parietal lobes allowed humans to imagine inner spaces, processes, and beings, and, with the frontal lobe, led to the first theologies composed of structured and integrated theories of the relationships between humans and the supernatural. The model leads to the evolution of a small population of African hominins that was ready to emerge with religious capacity when the species Homo sapiens evolved two hundred thousand years ago. By 50-60,000 years ago, when human ancestors left Africa, they were fully enabled.
Keywords: genetic drift, genomics, parietal expansion, religious capacity
Procedia PDF Downloads 341
2870 Characterization and Calibration of a Fluxgate Magnetometer Sensor 539
Authors: Luz Yoali Alfaro Luna, Angélica Hernández Rayas, Teodoro Córdova Fraga
Abstract:
This work characterizes and calibrates a fluxgate 539 magnetometer sensor, implementing a real-time monitoring interface to measure magnetic fields with high precision. The objective is to develop an innovative prototype integrating the Fluxgate 539 sensor, a WX-DC2412 power supply, and an Arduino UNO. Methods include interface programming and data conversion to Gauss units. The results show accurate measurements after calibrating the sensor, establishing a foundation for further research in magnetobiology.
Keywords: calibration, fluxgate 539, magnetobiology, magnetic field measurement, monitoring interface, sensor characterization
Procedia PDF Downloads 14
2869 Identifying the Factors Affecting the Success of Energy Usage Saving in the Municipality of Tehran
Authors: Rojin Bana Derakhshan, Abbas Toloie
Abstract:
To optimize and improve energy efficiency in buildings, it is necessary to identify the key elements of success in optimizing energy consumption before taking any action. Surveying principal components is one of the most valuable results of linear algebra, because simple non-parametric methods otherwise become confusing. Accordingly, an energy management system was implemented according to the international energy management standard ISO 50001:2011, and all energy parameters in the building were measured through an energy audit. In this study, data mining is used to determine the key elements influencing energy saving in buildings. The approach is based on statistical data mining techniques using a feature selection method and fuzzy logic, converting the data from a massive to a compressed form in order to strengthen the selected features. In addition, the share and magnitude of each energy-consuming element in overall energy dissipation, in percent, are identified as separate measures using the results of the energy audit and the measurement of all energy-consuming parameters and identified variables. Accordingly, energy saving solutions are divided into three categories: low-, medium-, and high-cost solutions.
Keywords: energy saving, key elements of success, optimization of energy consumption, data mining
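As an illustration of the feature selection step only (the paper's fuzzy-logic component and actual audit data are not reproduced here, so the variables and importance method below are assumptions), a minimal sketch might rank candidate energy parameters by their contribution to a measured consumption target:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Hypothetical audit data: rows = observation periods, columns = candidate
# energy parameters (e.g., lighting load, HVAC load, occupancy, outdoor temp).
rng = np.random.default_rng(0)
feature_names = ["lighting", "hvac", "occupancy", "outdoor_temp", "equipment"]
X = rng.uniform(0.0, 1.0, size=(500, len(feature_names)))
y = 3.0 * X[:, 1] + 1.5 * X[:, 0] + 0.5 * X[:, 4] + rng.normal(0, 0.1, 500)

# Rank features by importance as a stand-in for the selection step.
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
for name, score in sorted(zip(feature_names, model.feature_importances_),
                          key=lambda p: -p[1]):
    print(f"{name:>12s}: {score:.3f}")
```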
Procedia PDF Downloads 468
2868 An Approach on the Design of a Solar Cell Characterization Device
Authors: Christoph Mayer, Dominik Holzmann
Abstract:
This paper presents the development of a compact, portable, and easy-to-handle solar cell characterization device. The presented device reduces the effort and cost of single solar cell characterization to a minimum. It enables realistic characterization of cells under sunlight within minutes. In the field of photovoltaic research, the common way to characterize a single solar cell or a module is to measure the current-voltage curve. From this characteristic, the performance and the degradation rate can be determined, which are important for the consumer or developer. The paper consists of the system design description, a summary of the measurement results, and an outline of further developments.
Keywords: solar cell, photovoltaics, PV, characterization
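For background (a standard model, not something described in the abstract), measured current-voltage curves of a cell are often interpreted with the single-diode equation:

I = I_{ph} - I_0 \left[ \exp\!\left( \frac{V + I R_s}{n V_T} \right) - 1 \right] - \frac{V + I R_s}{R_{sh}}, \qquad V_T = \frac{kT}{q}

where I_{ph} is the photogenerated current, I_0 the diode saturation current, n the ideality factor, and R_s, R_{sh} the series and shunt resistances; fitting these parameters to the measured curve is one common way to quantify performance and degradation.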
Procedia PDF Downloads 421
2867 Quantum Coherence Sets the Quantum Speed Limit for Mixed States
Authors: Debasis Mondal, Chandan Datta, S. K. Sazim
Abstract:
Quantum coherence is a key resource, like entanglement and discord, in quantum information theory. Wigner-Yanase skew information, which was shown to be the quantum part of the uncertainty, has recently been proposed as an observable measure of quantum coherence. On the other hand, the quantum speed limit has been established as an important notion for developing ultra-fast quantum computers and communication channels. Here, we show that these two quantities are related, and thus cast coherence as a resource for controlling the speed of quantum communication. In this work, we address three basic and fundamental questions. There have been rigorous attempts to achieve tighter evolution-time bounds and to generalize them for mixed states. However, we are yet to know: (i) What is the ultimate limit of quantum speed? (ii) Can we measure this speed of quantum evolution in interferometry by measuring a physically realizable quantity? Most of the bounds in the literature are either not measurable in interference experiments or not tight enough. As a result, they cannot be effectively used in experiments on quantum metrology, quantum thermodynamics, and quantum communication, and especially in Unruh-effect detection, where a small fluctuation in a parameter needs to be detected. Therefore, a search for the tightest yet experimentally realisable bound is a need of the hour. It would be even more interesting if one could relate various properties of the states or operations, such as coherence, asymmetry, dimension, and quantum correlations, to the QSL. Although such an understanding may help us control and manipulate the speed of communication, apart from particular cases like the Josephson junction and the multipartite scenario, there has been little advancement in this direction. Therefore, the third question we ask is: (iii) Can we relate such quantities to the QSL? In this paper, we address these fundamental questions and show that quantum coherence or asymmetry plays an important role in setting the QSL. An important question in the study of the quantum speed limit is how it behaves under classical mixing and partial elimination of states. This is because it may help us choose a state or evolution operator appropriately to control the speed limit. In this paper, we address this question and show that the product of the time bound of the evolution and the quantum part of the uncertainty in energy, or the quantum coherence or asymmetry of the state with respect to the evolution operator, decreases under classical mixing and partial elimination of states.
Keywords: completely positive trace preserving maps, quantum coherence, quantum speed limit, Wigner-Yanase skew information
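For orientation, the textbook pure-state speed limits that work of this kind generalizes are the Mandelstam-Tamm and Margolus-Levitin bounds on the time needed to reach an orthogonal state (the mixed-state, skew-information-based bounds of the paper itself are not reproduced here):

\tau \;\ge\; \max\!\left\{ \frac{\pi\hbar}{2\,\Delta E},\; \frac{\pi\hbar}{2\,\langle E \rangle} \right\}

where \Delta E is the energy uncertainty of the state and \langle E \rangle its mean energy above the ground state.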
Procedia PDF Downloads 353
2866 Impact of Cultural Intelligence on Decision Making Styles of Managers: A Turkish Case
Authors: Fusun Akdag
Abstract:
Today, as business becomes increasingly global, managers and leaders of multinational or local companies work with employees and customers from a variety of cultural backgrounds. To do this effectively, they need to develop cultural competence. Therefore, cultural intelligence (CQ) becomes a vitally important aptitude and skill, especially for leaders. Organizational success or failure depends upon the kind of leadership provided to its members. The culture we are born into deeply affects our values, beliefs, and behavior. Cultural intelligence (CQ) focuses on how well individuals can relate and work across cultures. CQ helps minimize conflict and maximize the performance of a diverse workforce. The term 'decision' refers to a commitment to a course of action that is intended to serve the interests and values of particular people. One dimension of culture that has received attention is individualism-collectivism, or independence-interdependence. These dimensions are associated with different conceptualizations of the 'self.' Individualistic cultures tend to value personal goal pursuit as opposed to the pursuit of others’ goals. Collectivistic cultures, by contrast, view the 'self' as part of a whole. Each person is expected to work with his or her in-group toward shared goals and generally to pursue group harmony. These differences underlie cross-cultural variation in decision-making, such as the decision modes people use, their preferences, negotiation styles, creativity, and more. The aim of this study is to determine the effect of CQ on the decision-making styles of male and female managers in Turkey, an emergent-economy setting. A survey was distributed to gather data from managers at various companies. The questionnaire consists of three parts: demographics, the Cultural Intelligence Scale (CQS) to measure the four dimensions of cultural intelligence, and the General Decision Making Style (GDMS) Inventory to measure the five subscales of decision making. The results will indicate the Turkish managers’ scores on the metacognitive, cognitive, motivational, and behavioral aspects of cultural intelligence and the extent to which these scores affect their rational, avoidant, dependent, intuitive, and spontaneous decision-making styles, since business leaders make dozens of decisions every day that influence the success of the company and also have an impact on employees, customers, shareholders, and the market.
Keywords: cultural intelligence, decision making, gender differences, management styles
Procedia PDF Downloads 370
2865 A Hierarchical Bayesian Calibration of Data-Driven Models for Composite Laminate Consolidation
Authors: Nikolaos Papadimas, Joanna Bennett, Amir Sakhaei, Timothy Dodwell
Abstract:
Composite modeling of consolidation processes plays an important role in process and part design by indicating the formation of possible unwanted defects prior to expensive experimental iterative trial and development programs. Composite materials in their uncured state display complex constitutive behavior, which has received much academic interest, and different models have been proposed. Modeling and statistical errors which arise from this fitting will propagate through any simulation in which the material model is used. A general hyperelastic polynomial representation was proposed, which can be readily implemented in various nonlinear finite element packages; in our case, FEniCS was chosen. The coefficients are assumed uncertain, and therefore the distribution of parameters is learned using Markov Chain Monte Carlo (MCMC) methods. In engineering, the approach often followed is to select a single set of model parameters which, on average, best fits a set of experiments. There are good statistical reasons why this is not a rigorous approach to take. To overcome these challenges, a hierarchical Bayesian framework was proposed, in which the population distribution of model parameters is inferred from an ensemble of experimental tests. The resulting sampled distribution of hyperparameters is approximated using Maximum Entropy methods so that the distribution of samples can be readily sampled when embedded within a stochastic finite element simulation. The methodology is validated and demonstrated on a set of consolidation experiments of AS4/8852 with various stacking sequences. The resulting distributions are then applied to stochastic finite element simulations of the consolidation of curved parts, leading to a distribution of possible model outputs. With this, the paper, as far as the authors are aware, represents the first stochastic finite element implementation in composite process modelling.
Keywords: data-driven, material consolidation, stochastic finite elements, surrogate models
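The generic structure of such a hierarchical calibration can be written as follows (the notation is ours, for illustration; the paper's specific hyperelastic parametrization is not reproduced):

y_i \mid \theta_i \sim p(y_i \mid \theta_i), \qquad \theta_i \mid \phi \sim p(\theta_i \mid \phi), \qquad \phi \sim p(\phi)

p(\phi, \theta_{1:N} \mid y_{1:N}) \;\propto\; p(\phi) \prod_{i=1}^{N} p(\theta_i \mid \phi)\, p(y_i \mid \theta_i)

where y_i are the measurements from experiment i, \theta_i the material-model coefficients for that experiment, and \phi the population-level hyperparameters whose posterior is sampled with MCMC and then carried into the stochastic finite element simulation.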
Procedia PDF Downloads 146
2864 VaR or TCE: Explaining the Preferences of Regulators
Authors: Silvia Faroni, Olivier Le Courtois, Krzysztof Ostaszewski
Abstract:
While a lot of research concentrates on the merits of VaR and TCE, which are the two most classic risk indicators used by financial institutions, little has been written on explaining why regulators favor the choice of VaR or TCE in their set of rules. In this paper, we investigate the preferences of regulators with the aim of understanding why, for instance, a VaR with a given confidence level is ultimately retained. Further, this paper provides equivalence rules that explain how a given choice of VaR can be equivalent to a given choice of TCE. Then, we introduce a new risk indicator that extends TCE by providing a more versatile weighting of the constituents of probability distribution tails. All of our results are illustrated using the generalized Pareto distribution.
Keywords: generalized Pareto distribution, generalized tail conditional expectation, regulator preferences, risk measure
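For reference, the two risk measures compared here are conventionally defined on a loss variable X as (standard definitions, not the paper's extended indicator):

\mathrm{VaR}_\alpha(X) = \inf\{\, x \in \mathbb{R} : F_X(x) \ge \alpha \,\}, \qquad \mathrm{TCE}_\alpha(X) = \mathbb{E}\!\left[ X \mid X \ge \mathrm{VaR}_\alpha(X) \right]

so that TCE reports the average loss in the tail beyond the VaR quantile, which is why equivalence rules between a VaR level and a TCE level are possible for a given tail model such as the generalized Pareto distribution.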
Procedia PDF Downloads 170
2863 Improving the Design of Blood Pressure and Blood Saturation Monitors
Authors: L. Parisi
Abstract:
A blood pressure monitor, or sphygmomanometer, can be either manual or automatic, employing respectively the auscultatory method or the oscillometric method. The manual version of the sphygmomanometer involves an inflatable cuff with a stethoscope used to detect the sounds generated by the arterial walls in order to measure blood pressure in an artery. An automatic sphygmomanometer can be effectively used to monitor blood pressure through a pressure sensor, which detects vibrations provoked by oscillations of the arterial walls. The pressure sensor implemented in this device improves the accuracy of the measurements taken.
Keywords: blood pressure, blood saturation, sensors, actuators, design improvement
Procedia PDF Downloads 455
2862 The Impact of Animal Assisted Interventions in Primary Schools: A Mixed Method Intervention Study Examining the Influence of Reading to Dogs on Children's Reading Outcomes and Emotional Wellbeing
Authors: Jill Steel
Abstract:
The interlinked issues of emotional wellbeing and attainment continue to dominate international educational discourse. Reading skills are particularly important to attainment in all areas of the curriculum, and illiteracy is associated with reduced wellbeing and life prospects, with serious ramifications for the wider economy and society. Research shows that reading attainment is influenced by reading motivation and frequency. Reading to Dogs (RTD) is increasingly applied to promote reading motivation and frequency in schools despite a paucity of empirical evidence specifically examining the influence of RTD on emotional wellbeing and engagement with reading. This research aims to examine whether RTD is effective in promoting these positive outcomes among children aged eight to nine years. This study also aims to inform much-needed regulation of the field and standards of practice, including both child and dog welfare. Therefore, ethical matters such as children’s inclusion and safety, as well as the rights and wellbeing of dogs, infuse the study throughout. The methodological design is a mixed-method longitudinal study. A UK-wide questionnaire will be distributed to teachers between January and June 2020 to understand their perceptions of RTD. Following this, a randomised controlled trial (N = 100) will begin in August 2020 in two schools of a comparable demographic, with N = 50 in the intervention school and N = 50 in a waiting-list control school. Reading and wellbeing assessments will be conducted prior to and immediately post RTD, and four weeks after RTD to measure sustained changes. The reading assessments include the New Group Reading Test and the Motivation to Read Profile (Gambrell et al., 1995), as well as reading frequency and reading anxiety assessments specifically designed for the study. Wellbeing assessments include Goodman’s SDQ (1997) and pupil self-report questionnaires specifically designed for the study. Child, class teacher, and parent questionnaires and interviews prior to, during, and post RTD will be conducted to measure perceptions of the impact of RTD on mood and motivation towards reading. This study will make a substantial contribution to our understanding of the effectiveness of RTD and thus have consequences for the fields of education and anthrozoology.
Keywords: animal assisted intervention, reading to dogs, welfare, wellbeing
Procedia PDF Downloads 178
2861 An Approach to Physical Performance Analysis for Judo
Authors: Stefano Frassinelli, Alessandro Niccolai, Riccardo E. Zich
Abstract:
Sport performance analysis is a technique that is becoming more important every year for athletes of every level. Many techniques have been developed to measure and analyse the performance of athletes efficiently in some sports, but in combat sports these techniques often reach their limits, due to the high level of interaction between the two opponents during the competition. In this paper the problem will be framed. Moreover, the physical performance measurement problem will be analysed and three different techniques to manage it will be presented. All the techniques have been used to analyse the performance of 22 high-level judo athletes.
Keywords: sport performance, physical performance, judo, performance coefficients
Procedia PDF Downloads 413
2860 Quantification of Dispersion Effects in Arterial Spin Labelling Perfusion MRI
Authors: Rutej R. Mehta, Michael A. Chappell
Abstract:
Introduction: Arterial spin labelling (ASL) is an increasingly popular perfusion MRI technique, in which arterial blood water is magnetically labelled in the neck before flowing into the brain, providing a non-invasive measure of cerebral blood flow (CBF). The accuracy of ASL CBF measurements, however, is hampered by dispersion effects: the distortion of the ASL labelled bolus during its transit through the vasculature. In spite of this, the current recommended implementation of ASL – the white paper (Alsop et al., MRM, 73.1 (2015): 102-116) – does not account for dispersion, which leads to the introduction of errors in CBF. Given that the transport time from the labelling region to the tissue – the arterial transit time (ATT) – depends on the region of the brain and the condition of the patient, it is likely that these errors will also vary with the ATT. In this study, various dispersion models are assessed in comparison with the white paper (WP) formula for CBF quantification, enabling the errors introduced by the WP to be quantified. Additionally, this study examines the relationship between the errors associated with the WP and the ATT, and how this is influenced by dispersion. Methods: Data were simulated using the standard model for pseudo-continuous ASL, along with various dispersion models, and then quantified using the formula in the WP. The ATT was varied from 0.5 s to 1.3 s, and the errors associated with noise artefacts were computed in order to define the concept of significant error. The instantaneous slope of the error was also computed as an indicator of the sensitivity of the error to fluctuations in ATT. Finally, a regression analysis was performed to obtain the mean error against ATT. Results: An error of 20.9% was found to be comparable to that introduced by typical measurement noise. The WP formula was shown to introduce errors exceeding 20.9% for ATTs beyond 1.25 s even when dispersion effects were ignored. Using a Gaussian dispersion model, a mean error of 16% was introduced by using the WP, and a dispersion threshold of σ=0.6 was determined, beyond which the error was found to increase considerably with ATT. The mean error ranged from 44.5% to 73.5% when other physiologically plausible dispersion models were implemented, and the instantaneous slope varied from 35 to 75 as dispersion levels were varied. Conclusion: It has been shown that the WP quantification formula holds only within an ATT window of 0.5 to 1.25 s, and that this window gets narrower as dispersion occurs. Provided that the dispersion levels fall below the threshold evaluated in this study, however, the WP can measure CBF with reasonable accuracy if dispersion is correctly modelled by the Gaussian model. However, substantial errors were observed with other common models for dispersion at dispersion levels similar to those that have been observed in the literature.
Keywords: arterial spin labelling, dispersion, MRI, perfusion
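A minimal sketch of the kind of distortion being quantified is shown below. It is illustrative only: the labelled bolus is idealised as a rectangle, dispersion is approximated by convolution with a symmetric Gaussian kernel, and the timing values are assumed rather than taken from the study (the full pCASL kinetic model and the WP quantification formula are not reproduced).

```python
import numpy as np

# Idealised pCASL bolus of duration tau arriving at the arterial transit time,
# blurred by convolution with a unit-area Gaussian "vascular transport" kernel.
dt = 0.01                        # time step [s]
t = np.arange(0.0, 4.0, dt)      # time axis [s]
tau, att, sigma = 1.8, 1.0, 0.6  # label duration, ATT, dispersion width [s], assumed

ideal = ((t >= att) & (t < att + tau)).astype(float)     # undispersed bolus

kernel = np.exp(-0.5 * ((t - t.mean()) / sigma) ** 2)
kernel /= kernel.sum()                                    # normalise to unit area

dispersed = np.convolve(ideal, kernel, mode="same")       # dispersed bolus

# A crude indicator of the distortion that a dispersion-blind formula ignores:
print(f"peak attenuation of the bolus: {1 - dispersed.max() / ideal.max():.2f}")
```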
Procedia PDF Downloads 371
2859 A Regression Model for Predicting Sugar Crystal Size in a Fed-Batch Vacuum Evaporative Crystallizer
Authors: Sunday B. Alabi, Edikan P. Felix, Aniediong M. Umo
Abstract:
Crystal size distribution is of great importance in sugar factories. It determines the market value of granulated sugar and also influences the cost of production of sugar crystals. Typically, sugar is produced using a fed-batch vacuum evaporative crystallizer. The crystallization quality is examined by the crystal size distribution at the end of the process, which is quantified by two parameters: the average crystal size of the distribution, the mean aperture (MA), and the width of the distribution, the coefficient of variation (CV). The lack of real-time measurement of the sugar crystal size hinders its feedback control and the eventual optimisation of the crystallization process. An attractive alternative is to use a soft sensor (model-based method) for online estimation of the sugar crystal size. Unfortunately, the available models for the sugar crystallization process are not suitable, as they do not contain variables that can be measured easily online. The main contribution of this paper is the development of a regression model for estimating the sugar crystal size as a function of input variables which are easy to measure online. This has the potential to provide real-time estimates of crystal size for its effective feedback control. Using 7 input variables, namely initial crystal size (Lo), temperature (T), vacuum pressure (P), feed flowrate (Ff), steam flowrate (Fs), initial super-saturation (S0) and crystallization time (t), preliminary studies were carried out using Minitab 14 statistical software. Based on the existing sugar crystallizer models and the typical ranges of these 7 input variables, 128 datasets were obtained from a 2-level factorial experimental design. These datasets were used to obtain a simple but online-implementable 6-input crystal size model. It seems the initial crystal size (Lₒ) does not play a significant role. The goodness of the resulting regression model was evaluated. The coefficient of determination, R², was obtained as 0.994, and the maximum absolute relative error (MARE) was obtained as 4.6%. The high R² (~1.0) and the reasonably low MARE values are an indication that the model is able to predict sugar crystal size accurately as a function of the 6 easy-to-measure online variables. Thus, the model can be used as a soft sensor to provide real-time estimates of sugar crystal size during the sugar crystallization process in a fed-batch vacuum evaporative crystallizer.
Keywords: crystal size, regression model, soft sensor, sugar, vacuum evaporative crystallizer
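The workflow described above can be sketched as follows. The data are synthetic stand-ins, since the paper's 128 factorial-design runs and fitted coefficients are not reported in the abstract; only the structure (six inputs, a linear regression fit, R² and MARE as goodness measures) mirrors the description.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

rng = np.random.default_rng(1)
n = 128                                    # same size as the reported design
# Six easy-to-measure inputs standing in for T, P, Ff, Fs, S0, t (scaled 0..1).
X = rng.uniform(0.0, 1.0, size=(n, 6))
true_coef = np.array([0.8, 0.3, 0.5, 0.4, 1.1, 0.9])     # assumed, for illustration
y = 1.0 + X @ true_coef + rng.normal(0.0, 0.02, size=n)  # "crystal size" response

model = LinearRegression().fit(X, y)
y_hat = model.predict(X)

r2 = r2_score(y, y_hat)
mare = np.max(np.abs((y - y_hat) / y)) * 100   # maximum absolute relative error, %
print(f"R^2 = {r2:.3f}, MARE = {mare:.1f}%")
```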
Procedia PDF Downloads 208
2858 Measuring the Cavitation Cloud by Electrical Impedance Tomography
Authors: Michal Malik, Jiri Primas, Darina Jasikova, Michal Kotek, Vaclav Kopecky
Abstract:
This paper is a case study dealing with the viability of using electrical impedance tomography for measuring cavitation clouds in a pipe setup. The authors used a simple passive cavitation generator to produce a cavitation cloud, which was then recorded for multiple flow rates using electrodes in two measuring planes. The paper presents the results of the experiment, showing that the industrial-grade tomography system ITS p2+ used is able to measure the cavitation cloud and may be particularly useful for identifying the inception of cavitation in setups where other measuring tools may not be viable.
Keywords: cavitation cloud, conductivity measurement, electrical impedance tomography, mechanically induced cavitation
Procedia PDF Downloads 248
2857 Traverse Surveying Table Simple and Sure
Authors: Hamid Fallah
Abstract:
Establishing survey stations is the first thing that a surveyor learns; the stations are used for control and setting out in projects such as buildings, roads, tunnels, and monitoring, and in anything else related to the preparation of maps. In this article, we present the method of calculation through the traverse table and, by checking several examples of errors made by several publishers of surveying books in the calculations of this table, we also verify the results of several software packages in a simple way. Surveyors measure angles and lengths when establishing survey stations, so the most important task of a surveyor is to be able to correctly remove the angle and length errors from the calculations and to determine whether the amount of error is within the permissible limit for its removal or not.
Keywords: UTM, localization, scale factor, Cartesian, traverse
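The core of a traverse table is the computation of departures and latitudes for each leg and the check of their closure. The short sketch below illustrates only that generic textbook computation on an assumed four-leg closed traverse; it is not the authors' specific table or data.

```python
import math

# (azimuth in decimal degrees, horizontal distance in metres) for each leg
legs = [
    (45.0, 100.00),
    (135.0, 100.02),
    (225.0, 99.98),
    (315.0, 100.01),
]

dep = sum(d * math.sin(math.radians(az)) for az, d in legs)  # sum of departures (E)
lat = sum(d * math.cos(math.radians(az)) for az, d in legs)  # sum of latitudes (N)

linear_misclosure = math.hypot(dep, lat)   # zero for a perfectly closed traverse
perimeter = sum(d for _, d in legs)
print(f"linear misclosure: {linear_misclosure:.4f} m")
print(f"relative precision: 1/{perimeter / linear_misclosure:.0f}")
```

The angular check is analogous: for a closed traverse with n stations the measured interior angles should sum to (n - 2) x 180 degrees, and the misclosure is compared against the permissible limit before being distributed.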
Procedia PDF Downloads 82
2856 AI-Based Autonomous Plant Health Monitoring and Control System with Visual Health-Scoring Models
Authors: Uvais Qidwai, Amor Moursi, Mohamed Tahar, Malek Hamad, Hamad Alansi
Abstract:
This paper focuses on the development and implementation of an advanced plant health monitoring system with an AI backbone and an IoT sensory network. Our approach involves addressing the critical environmental factors essential for preserving a plant’s well-being, including air temperature, soil moisture, soil temperature, soil conductivity, pH, water levels, and humidity, as well as the presence of essential nutrients like nitrogen, phosphorus, and potassium. Central to our methodology is the utilization of computer vision technology, particularly a night vision camera. The captured data are then compared against a reference database containing different health statuses. This comparative analysis is implemented using an AI deep learning model, which enables us to generate accurate assessments of plant health status. By combining this with the AI-based decision-making approach, our system aims to provide precise and timely insights into the overall health and well-being of plants, offering a valuable tool for effective plant care and management.
Keywords: deep learning image model, IoT sensing, cloud-based analysis, remote monitoring app, computer vision, fuzzy control
Procedia PDF Downloads 54
2855 Gaze Behaviour of Individuals with and without Intellectual Disability for Nonaccidental and Metric Shape Properties
Authors: S. Haider, B. Bhushan
Abstract:
The eye gaze behaviour of individuals with and without intellectual disability is investigated in an eye tracking study in terms of sensitivity to nonaccidental (NAP) and metric (MP) shape properties. Total fixation time is used as an indirect measure of attention allocation. Studies have found mean reaction times for nonaccidental properties (NAPs) to be shorter than for metric properties (MPs) when the MP and NAP differences were equalized. Methods: Twenty-five individuals with intellectual disability (mild and moderate levels of mental retardation) and twenty-seven normal individuals were compared on mean total fixation duration, accuracy level, and mean reaction time for mild NAP, extreme NAP, and metric versions of the images. 2D images of cylinders were adapted and made into forced-choice match-to-sample tasks. A Tobii TX300 eye tracker was used to record total fixation duration and data obtained from the areas of interest (AOI). Variable trial duration (total reaction time of each participant) and fixed trial duration (data taken at each second from one to fifteen seconds) data were used for the analyses. The two groups did not differ in terms of fixation times (fixed as well as variable) across any of the three image manipulations but differed in terms of reaction time and accuracy. Normal individuals had longer reaction times compared to individuals with intellectual disability across all types of images. The two groups differed significantly on the accuracy measure across all image types, with normal individuals performing better across all three types of images. Mild NAP vs. metric differences: There was a significant difference between mild NAP and metric versions of the images in terms of reaction times. Mild NAP images had significantly longer reaction times compared to metric images for normal individuals, but this difference was not found for individuals with intellectual disability. Mild NAP images had significantly better accuracy levels compared to metric images for both groups. In conclusion, the type of image manipulation did not result in differences in attention allocation for individuals with and without intellectual disability. Mild nonaccidental properties facilitate better accuracy levels compared to metric properties in both groups, but this advantage is seen only for the normal group in terms of mean reaction time.
Keywords: eye gaze fixations, eye movements, intellectual disability, stimulus properties
Procedia PDF Downloads 553
2854 The Roles, Strategic Coordination, and Alignment of CTOs: A Systematic Literature Review
Authors: Shailendra Natraj, Kristin Paetzold, B. R. Katzy
Abstract:
The significant role of technology in strategic business decisions has created the need for executives who understand technology and recognize profitable applications to products, services and processes. The role of CTOs is very complex within technology-based firms, stretching from the technological aspects to the strategic goals and vision of the firm. Often the roles of CTOs range from functional leaders to strategic leaders or super-functional leaders. In most companies the roles are unclear and fuzzy. In our research, we explore each of these orientations and the links between the leadership types of CTOs (functional, strategic and super-functional), their responsibilities and credibility, and their strategic and conceptual responsibilities. Approach: We conducted a comprehensive literature review with the available databank sources. Results: From the literature review we could identify that most of the research work conducted so far was mainly distributed between the roles and responsibilities of CTOs. The available sources were largely limited to the roles of CTOs as functional leaders. Contribution: Based on the literature review, we identify that what has not yet been the focus of research are (a) the leadership types of CTOs (mainly strategic and super-functional leaders), (b) the responsibilities and credibility of CTOs, and (c) the strategic and conceptual responsibilities of CTOs.
Keywords: CTO, chief technology officer, strategy, technology leaders
Procedia PDF Downloads 512