Search results for: resistance-capacitance network model
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 19676

15986 A Methodology of Using Fuzzy Logics and Data Analytics to Estimate the Life Cycle Indicators of Solar Photovoltaics

Authors: Thor Alexis Sazon, Alexander Guzman-Urbina, Yasuhiro Fukushima

Abstract:

This study outlines a method for developing a surrogate life cycle model based on fuzzy logic using three fuzzy inference methods: (1) the conventional Fuzzy Inference System (FIS), (2) the hybrid system of Data Analytics and Fuzzy Inference (DAFIS), which uses data clustering for defining the membership functions, and (3) the Adaptive-Neuro Fuzzy Inference System (ANFIS), a combination of fuzzy inference and artificial neural network. These methods were demonstrated with a case study in which the Global Warming Potential (GWP) and the Levelized Cost of Energy (LCOE) of solar photovoltaic (PV) were estimated using Solar Irradiation, Module Efficiency, and Performance Ratio as inputs. The effects of the fuzzy inference type (Sugeno or Mamdani) and of the number of input membership functions on the error between the calibration data and the model-generated outputs were also illustrated. The solution spaces of the three methods were then examined with a sensitivity analysis. ANFIS exhibited the lowest error, while DAFIS gave slightly lower errors than FIS. Increasing the number of input membership functions helped with error reduction in some cases but, at times, resulted in the opposite. Sugeno-type models gave errors slightly lower than those of the Mamdani type. While ANFIS is superior in terms of error minimization, it could generate questionable solutions, e.g., negative GWP values for the solar PV system when the inputs were all at the upper end of their ranges. This shows that the applicability of ANFIS models depends strongly on the range of cases over which they were calibrated. FIS and DAFIS generated more intuitive trends in the sensitivity runs. DAFIS exhibited an optimal design point beyond which increasing the input values no longer improves the GWP and LCOE. In the absence of data that could be used for calibration, conventional FIS provides a knowledge-based model that can be used for prediction; in the PV case study, it generated errors only slightly higher than those of DAFIS. The inherent complexity of a life cycle study often hinders its widespread use in the industry and policy-making sectors. While the methodology does not guarantee a more accurate result than the full life cycle methodology, it does provide a relatively simple way of generating knowledge- and data-based estimates that can be used during the initial design of a system.
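
As a minimal illustration of the knowledge-based (Mamdani-type) FIS described above, the Python sketch below uses scikit-fuzzy; the universes of discourse, membership partitions and rules are hypothetical placeholders, not the calibrated model from the study.

```python
import numpy as np
from skfuzzy import control as ctrl

# Hypothetical universes for two of the three inputs and one output.
irradiation = ctrl.Antecedent(np.arange(800, 2401, 10), 'irradiation')  # kWh/m2/yr
efficiency = ctrl.Antecedent(np.arange(10, 26, 1), 'efficiency')        # %
lcoe = ctrl.Consequent(np.arange(2, 31, 1), 'lcoe')                     # cents/kWh

# Three membership functions per variable (automf labels them
# 'poor'/'average'/'good'; for lcoe, 'poor' simply means the low end).
for var in (irradiation, efficiency, lcoe):
    var.automf(3)

# Illustrative knowledge-based rules of the kind a conventional FIS encodes.
rules = [
    ctrl.Rule(irradiation['good'] & efficiency['good'], lcoe['poor']),
    ctrl.Rule(irradiation['average'] | efficiency['average'], lcoe['average']),
    ctrl.Rule(irradiation['poor'] & efficiency['poor'], lcoe['good']),
]

sim = ctrl.ControlSystemSimulation(ctrl.ControlSystem(rules))
sim.input['irradiation'] = 1700
sim.input['efficiency'] = 20
sim.compute()                      # Mamdani inference with centroid defuzzification
print(round(sim.output['lcoe'], 2))
```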

Keywords: solar photovoltaic, fuzzy logic, inference system, artificial neural networks

Procedia PDF Downloads 158
15985 Automated Computer-Vision Analysis Pipeline of Calcium Imaging Neuronal Network Activity Data

Authors: David Oluigbo, Erik Hemberg, Nathan Shwatal, Wenqi Ding, Yin Yuan, Susanna Mierau

Abstract:

Introduction: Calcium imaging is an established technique in neuroscience research for detecting activity in neural networks. Bursts of action potentials in neurons lead to transient increases in intracellular calcium, visualized with fluorescent indicators. Manual identification of cell bodies and their contours by experts typically takes 10-20 minutes per calcium imaging recording. Our aim, therefore, was to design an automated pipeline to facilitate and optimize calcium imaging data analysis. Our pipeline aims to accelerate cell body and contour identification and the production of graphical representations reflecting changes in neuronal calcium-based fluorescence. Methods: We created a Python-based pipeline that uses OpenCV (a Python computer vision package) to accurately (1) detect neuron contours, (2) extract the mean fluorescence within each contour, and (3) identify transient changes in the fluorescence due to neuronal activity. The pipeline consisted of three Python scripts that could all be accessed easily through a Python Jupyter notebook. In total, we tested this pipeline on ten separate calcium imaging datasets from murine dissociated cortical cultures. We then compared our automated pipeline outputs with manually labeled data for neuronal cell location and the corresponding fluorescence time series generated by an expert neuroscientist. Results: Our results show that the automated pipeline efficiently pinpoints neuronal cell body locations and contours and provides a graphical representation of neural network metrics accurately reflecting changes in neuronal calcium-based fluorescence. The pipeline detected the shape, area, and location of most neuronal cell body contours by using binary thresholding and grayscale image conversion, which allow computer vision to better distinguish between cells and non-cells. Its results were comparable to manually analyzed results but with significantly reduced acquisition times of 2-5 minutes per recording versus 10-20 minutes per recording. Based on these findings, our next step is to measure precisely the specificity and sensitivity of the automated pipeline's cell body and contour detection in order to extract more robust neural network metrics and dynamics. Conclusion: Our Python-based pipeline performed automated computer vision-based analysis of calcium imaging recordings from neuronal cell bodies in neuronal cell cultures. Our new goal is to improve cell body and contour detection to produce more robust, accurate neural network metrics and dynamic graphs.
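
A minimal sketch of the contour-detection and fluorescence-extraction steps is shown below, assuming OpenCV 4; the file name, threshold value and area cut-off are illustrative, not the authors' settings.

```python
import cv2
import numpy as np

frame = cv2.imread("frame.png")                              # one calcium imaging frame
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)               # grayscale conversion
_, binary = cv2.threshold(gray, 60, 255, cv2.THRESH_BINARY)  # binary thresholding

# Detect candidate cell-body contours and filter out small non-cell specks.
contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
cells = [c for c in contours if cv2.contourArea(c) > 50]

# Mean fluorescence within each contour, via a filled mask; repeating this
# across frames yields the per-cell fluorescence time series.
for c in cells:
    mask = np.zeros(gray.shape, dtype=np.uint8)
    cv2.drawContours(mask, [c], -1, 255, thickness=cv2.FILLED)
    print(cv2.contourArea(c), cv2.mean(gray, mask=mask)[0])
```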

Keywords: calcium imaging, computer vision, neural activity, neural networks

Procedia PDF Downloads 78
15984 Learning Algorithms for Fuzzy Inference Systems Composed of Double- and Single-Input Rule Modules

Authors: Hirofumi Miyajima, Kazuya Kishida, Noritaka Shigei, Hiromi Miyajima

Abstract:

Most self-tuning fuzzy systems, which are automatically constructed from learning data, are based on the steepest descent method (SDM). However, this approach often requires a long convergence time and gets stuck in shallow local minima. One solution is to use fuzzy rule modules with a small number of inputs, such as DIRMs (Double-Input Rule Modules) and SIRMs (Single-Input Rule Modules). In this paper, we consider a (generalized) DIRMs model composed of double- and single-input rule modules. Further, in order to reduce redundant modules in the (generalized) DIRMs model, pruning and generative learning algorithms are proposed. To show their effectiveness, numerical simulations on function approximation, the Box-Jenkins problem, and obstacle avoidance are performed.
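
As a rough sketch of how a SIRMs-type model produces its output (one single-input fuzzy module per input, combined by importance weights), the Python snippet below uses Gaussian membership functions; all centers, widths, consequents and weights are illustrative rather than values learned by the proposed algorithms.

```python
import numpy as np

def sirm_output(x_i, centers, widths, consequents):
    """Single-input rule module: firing-strength-weighted average of consequents."""
    mu = np.exp(-((x_i - centers) ** 2) / (2 * widths ** 2))  # rule firing strengths
    return np.sum(mu * consequents) / np.sum(mu)

centers = np.array([0.0, 0.5, 1.0])        # antecedent membership centers
widths = np.array([0.2, 0.2, 0.2])
consequents = np.array([[0.0, 0.3, 1.0],   # module for input x1
                        [1.0, 0.5, 0.0]])  # module for input x2
importance = np.array([0.7, 0.3])          # module importance weights w_i

# Final inference: y = sum_i w_i * y_i(x_i); in SDM-based learning the weights
# and consequents would be tuned by gradient descent on the training error.
x = np.array([0.4, 0.8])
y = sum(importance[i] * sirm_output(x[i], centers, widths, consequents[i])
        for i in range(len(x)))
print(y)
```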

Keywords: Box-Jenkins problem, double-input rule module, fuzzy inference model, obstacle avoidance, single-input rule module

Procedia PDF Downloads 349
15983 Improving Reading Comprehension Skills of Elementary School Students through Cooperative Integrated Reading and Composition Model Using Padlet

Authors: Neneng Hayatul Milah

Abstract:

The most important reading skill for students is comprehension. Understanding a reading text has an impact on learning outcomes; however, reading comprehension instruction in Indonesian elementary schools is lacking, and a more effective learning model is needed to enhance students' reading comprehension. This study aimed to evaluate the effectiveness of the CIRC (Cooperative Integrated Reading and Composition) model with Padlet integration in improving the reading comprehension skills of grade IV students in elementary schools in Cimahi City, Indonesia. The methodology was quantitative, of the pre-experimental type with a one-group pretest-posttest design. The sample consisted of 30 students. Statistical analysis showed a significant effect of the CIRC learning model with Padlet on students' comprehension of narrative text. The mean pretest score was 67.41, while the mean posttest score increased to 84.82. The paired-sample t-test yielded t = -13.706 with a significance of p < 0.001, which is smaller than α = 0.05. This research is expected to provide useful insights for educational practitioners on how the CIRC model with Padlet can improve the reading comprehension skills of elementary school students.
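
The reported analysis can be reproduced in outline with SciPy's paired-sample t-test; the scores below are synthetic stand-ins (only the group size and the means 67.41 and 84.82 come from the abstract).

```python
import numpy as np
from scipy import stats

# Synthetic pretest/posttest scores for n = 30 students, centred on the
# reported means; the spread and gain distribution are assumptions.
rng = np.random.default_rng(0)
pretest = np.clip(rng.normal(67.41, 7, 30), 0, 100)
posttest = np.clip(pretest + rng.normal(17.4, 5, 30), 0, 100)

t_stat, p_value = stats.ttest_rel(pretest, posttest)  # paired-sample t-test
print(f"t = {t_stat:.3f}, p = {p_value:.4g}")         # significant if p < 0.05
```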

Keywords: reading comprehension skills, CIRC, Padlet, narrative text

Procedia PDF Downloads 17
15982 On-Chip Sensor Ellipse Distribution Method and Equivalent Mapping Technique for Real-Time Hardware Trojan Detection and Location

Authors: Longfei Wang, Selçuk Köse

Abstract:

Hardware Trojans have become a great concern as integrated circuit (IC) technology advances and not all manufacturing steps of an IC are accomplished within one company. Real-time hardware Trojan detection has proven to be a feasible way to detect randomly activated Trojans that cannot be detected at the testing stage. On-chip sensors are a strong candidate for implementing real-time hardware Trojan detection; however, the optimization of on-chip sensors has not been thoroughly investigated, and the localization of Trojans has not been carefully explored. In this paper, an on-chip sensor ellipse distribution method and an equivalent mapping technique are proposed, based on the characteristics of the on-chip power delivery network, to address the optimization and distribution of on-chip sensors for real-time hardware Trojan detection as well as to estimate the location and current consumption of a hardware Trojan. Simulation results verify that hardware Trojan activation can be effectively detected and that the location of a hardware Trojan can be efficiently estimated with less than 5% error for a realistic power grid using the proposed methods. The proposed techniques therefore lay a solid foundation for isolating and even deactivating hardware Trojans through their accurate localization.

Keywords: hardware trojan, on-chip sensor, power distribution network, power/ground noise

Procedia PDF Downloads 384
15981 CFD Study of Subcooled Boiling Flow at Elevated Pressure Using a Mechanistic Wall Heat Partitioning Model

Authors: Machimontorn Promtong, Sherman C. P. Cheung, Guan H. Yeoh, Sara Vahaji, Jiyuan Tu

Abstract:

The wide range of industrial applications involving boiling flows underlines the necessity of establishing fundamental knowledge of boiling flow phenomena. For this purpose, a number of experimental and numerical studies have been performed to elucidate the underlying physics of this flow. In this paper, improved wall boiling models, implemented in ANSYS CFX 14.5, are introduced to study subcooled boiling flow at elevated pressure. At the heated wall boundary, the fractal model, a force-balance approach and a mechanistic frequency model are used to predict the nucleation site density, bubble departure diameter and bubble departure frequency, respectively. The presented wall heat flux partitioning closures were modified to consider the influence of bubble sliding along the wall before lift-off, which commonly occurs in flow boiling. The simulation was performed based on the two-fluid model, with the standard k-ω SST model selected for turbulence modelling. Existing experimental data at around 5 bar were chosen to evaluate the accuracy of the presented mechanistic approach. The void fraction and Interfacial Area Concentration (IAC) are in good agreement with the experimental data; however, the predicted bubble velocity and Sauter Mean Diameter (SMD) are over-predicted. This over-prediction may be caused by considering only dispersed, spherical bubbles in the simulations. In future work, important physical mechanisms of bubbles, such as merging and shrinking while sliding on the heated wall, will be incorporated into this mechanistic model to enhance its capability for a wider range of flow predictions.

Keywords: subcooled boiling flow, computational fluid dynamics (CFD), mechanistic approach, two-fluid model

Procedia PDF Downloads 314
15980 Multi-Objective Multi-Period Allocation of Temporary Earthquake Disaster Response Facilities with Multi-Commodities

Authors: Abolghasem Yousefi-Babadi, Ali Bozorgi-Amiri, Aida Kazempour, Reza Tavakkoli-Moghaddam, Maryam Irani

Abstract:

All over the world, natural disasters (e.g., earthquakes, floods, volcanoes and hurricanes) cause many deaths. Earthquakes in particular are catastrophic events, triggered by unusual phenomena, that lead to great losses around the world and demand long-term help and relief, which can be hard to manage. Supplying commodities and facilities to disaster regions after an earthquake, to satisfy the demands of the people suffering from it, is a very important challenge. This paper formulates the disaster response facility allocation problem for disaster relief operations as a mathematical programming model. Earthquake victims need not only consumable commodities (e.g., food and water) but also non-consumable commodities (e.g., clothes) to protect themselves; paying attention to disaster points and people's demands is therefore essential, and both consumable and non-consumable commodities are considered in the presented model. The multi-objective, multi-period mathematical programming model simultaneously minimizes the average of the weighted response times and the total operational cost, including penalty costs for unmet demand and unused commodities. Furthermore, a Chebycheff multi-objective solution procedure is applied as a powerful algorithm to solve the proposed model. Finally, a case study of a Tehran earthquake illustrates the model's applicability, and a sensitivity analysis is carried out for model validation.
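
A toy sketch of the weighted Chebycheff scalarization step is given below: among candidate allocation plans, it selects the one minimizing the worst weighted deviation from the ideal point. The objective values and weights are illustrative, and real use would normalize the two objectives to comparable scales.

```python
import numpy as np

# Candidate plans with two objective values each (both to be minimized):
# f1 = average weighted response time, f2 = operational + penalty cost.
candidates = np.array([
    [4.2, 130.0],   # plan A
    [3.1, 155.0],   # plan B
    [5.0, 110.0],   # plan C
])
weights = np.array([0.5, 0.5])
ideal = candidates.min(axis=0)  # z*: best value achieved per objective

# Chebycheff metric: worst weighted deviation from the ideal point.
chebycheff = (weights * (candidates - ideal)).max(axis=1)
best = int(np.argmin(chebycheff))
print(f"selected plan index: {best}, metric = {chebycheff[best]:.3f}")
```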

Keywords: facility location, multi-objective model, disaster response, commodity

Procedia PDF Downloads 253
15979 The Value of Store Choice Criteria on Perceived Patronage Intentions

Authors: Susana Marques

Abstract:

Research on how store environment cues influence consumers' store choice decision criteria, such as store operations, product quality, monetary price, store image and sales promotion, is sparse; research on the simultaneous impact of multiple store environment cues is especially absent. The authors propose a comprehensive store choice model that includes three types of store environment cues as exogenous constructs, various store choice criteria as possible mediating constructs, and store patronage intentions as an endogenous construct. The model was tested using structural equation modelling on a sample of 561 hypermarket customers and is partially supported.
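
A minimal sketch of such a mediated store choice model in Python, assuming the semopy package; the construct names, indicator items and data file are hypothetical stand-ins for the survey instrument and the 561-customer sample.

```python
import pandas as pd
from semopy import Model

# Hypothetical measurement and structural specification in lavaan-like syntax:
# observed items load on latent constructs, and choice criteria predict patronage.
spec = """
quality =~ q1 + q2 + q3
price =~ p1 + p2
patronage =~ i1 + i2
patronage ~ quality + price
"""

data = pd.read_csv("store_survey.csv")   # one row per respondent (hypothetical file)
model = Model(spec)
model.fit(data)
print(model.inspect())                   # path estimates, standard errors, p-values
```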

Keywords: store choice, store patronage, structural equation modelling, retailing

Procedia PDF Downloads 265
15978 A Comparative Study on ANN, ANFIS and SVM Methods for Computing Resonant Frequency of A-Shaped Compact Microstrip Antennas

Authors: Ahmet Kayabasi, Ali Akdagli

Abstract:

In this study, three robust prediction methods, namely the artificial neural network (ANN), the adaptive neuro-fuzzy inference system (ANFIS) and the support vector machine (SVM), were used for computing the resonant frequency of A-shaped compact microstrip antennas (ACMAs) operating in the UHF band. First, the resonant frequencies of 144 ACMAs with various dimensions and electrical parameters were simulated with IE3D™, which is based on the method of moments (MoM). ANN, ANFIS and SVM models for computing the resonant frequency were then built from the simulation data: 124 simulated ACMAs were used for training and the remaining 20 for testing. The performance of the ANN, ANFIS and SVM models is compared for both the training and testing processes. The average percentage errors (APE) of the computed resonant frequencies in training were 0.457% for ANN, 0.399% for ANFIS and 0.600% for SVM. The constructed models were then tested, achieving APE values of 0.601% for ANN, 0.744% for ANFIS and 0.623% for SVM. These results show that the ANN, ANFIS and SVM methods can be successfully applied to compute the resonant frequency of ACMAs, since they are useful and versatile methods that yield accurate results.
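
The modelling workflow can be sketched with scikit-learn as below, keeping the 124/20 train/test split and the APE metric from the abstract; the feature matrix is random stand-in data, since the IE3D simulation results are not reproduced here (ANFIS has no scikit-learn counterpart, so only ANN and SVM are shown).

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Stand-in data: 144 antennas, 6 geometric/electrical features, one frequency.
rng = np.random.default_rng(1)
X = rng.uniform(size=(144, 6))
y = 1.0 + 2.0 * X[:, 0] - 0.5 * X[:, 1] + 0.1 * rng.normal(size=144)
X_train, X_test, y_train, y_test = X[:124], X[124:], y[:124], y[124:]

def ape(y_true, y_pred):
    """Average percentage error, as reported in the abstract."""
    return 100 * np.mean(np.abs((y_true - y_pred) / y_true))

models = [
    ("ANN", make_pipeline(StandardScaler(),
                          MLPRegressor(hidden_layer_sizes=(20,), max_iter=5000,
                                       random_state=0))),
    ("SVM", make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0))),
]
for name, model in models:
    model.fit(X_train, y_train)
    print(name, f"test APE = {ape(y_test, model.predict(X_test)):.3f}%")
```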

Keywords: a-shaped compact microstrip antenna, artificial neural network (ANN), adaptive neuro-fuzzy inference system (ANFIS), support vector machine (SVM)

Procedia PDF Downloads 436
15977 The Circularity of Re-Refined Used Motor Oils: Measuring Impacts and Ensuring Responsible Procurement

Authors: Farah Kanani

Abstract:

Blue Tide Environmental is a company focused on developing a network of used motor oil recycling facilities across the U.S. The company initiated the redesign of its recycling plant in Texas and aimed to establish an updated carbon footprint of re-refined used motor oil compared with an equivalent product derived from virgin stock. The aim was to quantify the emissions savings of a circular alternative to the conventional end-of-life combustion of used motor oil (UMO). To do so, an ISO-compliant carbon footprint was mandated, utilizing complex models requiring geographical and temporal accuracy to reflect the U.S. refinery market. The quantification of linear and circular flows, proxies for fuel substitution, and system expansion for multi-product outputs were all critical methodological choices and were tested through sensitivity analyses. The re-refined system consists of the continuous recycling of UMO, and thus end-of-life is considered non-existent. The perspective taken on this topic is a life cycle, i.e. holistic, one, demonstrating through this example how a cradle-to-cradle model can be used to quantify a comparative carbon footprint. The intended audience is lubricant manufacturers as consumers, motor oil industry professionals, and other industry members interested in performing cradle-to-cradle modeling.

Keywords: circularity, used motor oil, re-refining, systems expansion

Procedia PDF Downloads 28
15976 Development of an Artificial Neural Network to Measure Science Literacy Leveraging Neuroscience

Authors: Amanda Kavner, Richard Lamb

Abstract:

Faster growth in the science and technology of other nations may make staying globally competitive more difficult without a shift in focus on how science is taught in US classes. An integral part of learning science involves visual and spatial thinking, since complex, real-world phenomena are often expressed in visual, symbolic, and concrete modes. The primary barrier to spatial thinking and visual literacy in Science, Technology, Engineering, and Math (STEM) fields is representational competence, which includes the ability to generate, transform, analyze and explain representations, as opposed to generic spatial ability. Although the relationship between foundational visual literacy and domain-specific science literacy is known, science literacy as a function of science learning is still not well understood, and a more reliable measure is needed to design resources that enhance the fundamental visuospatial cognitive processes behind scientific literacy. To support the improvement of students' representational competence, the visualization skills necessary to process science representations first need to be identified, which necessitates an instrument to quantitatively measure visual literacy. With such a measure, schools, teachers, and curriculum designers can target the individual skills necessary to improve students' visual literacy, thereby increasing science achievement. This project details the development of an artificial neural network capable of measuring science literacy using functional near-infrared spectroscopy (fNIR) data. These data were previously collected by Project LENS (Leveraging Expertise in Neurotechnologies), a Science of Learning Collaborative Network (SL-CN) of STEM Education scholars from three US universities (NSF award 1540888), utilizing mental rotation tasks to assess student visual literacy. Hemodynamic response data from fNIRsoft were exported as an Excel file, with 80 each of 2D wedge-and-dash models and 3D stick-and-ball models. Complexity data were in an Excel workbook separated by participant ID, containing information for both types of tasks. After converting strings to numbers for analysis, spreadsheets with measurement data and complexity data were uploaded to RapidMiner's TurboPrep and merged. Using RapidMiner Studio, a Gradient Boosted Trees model consisting of 140 trees with a maximum depth of 7 was developed, and 99.7% of its predictions were accurate. The model determined that the biggest predictors of a successful mental rotation are the individual problem number, the response time, and fNIR optode #16, located over the right prefrontal cortex, a region important in processing visuospatial working memory and episodic memory retrieval, both vital for science literacy. With an unbiased measurement of science literacy provided by psychophysiological measurements analyzed with such a model, educators and curriculum designers will be able to create targeted classroom resources to help improve student visuospatial literacy, and therefore science literacy.
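
The reported model configuration (140 trees, maximum depth 7) maps directly onto scikit-learn's gradient boosted trees, as sketched below on synthetic stand-in features for the optode signals and task variables.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-in data: 18 features per trial (e.g., optode signals,
# response time); the label is success/failure on a mental rotation task.
rng = np.random.default_rng(2)
X = rng.normal(size=(1000, 18))
y = (X[:, 16] + X[:, 1] > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
clf = GradientBoostingClassifier(n_estimators=140, max_depth=7, random_state=0)
clf.fit(X_tr, y_tr)
print(f"accuracy = {clf.score(X_te, y_te):.3f}")
print("top predictors:", np.argsort(clf.feature_importances_)[::-1][:3])
```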

Keywords: artificial intelligence, artificial neural network, machine learning, science literacy, neuroscience

Procedia PDF Downloads 116
15975 The Growth Curve of Gompertz Model in Body Weight of Slovak Mixed-Sex Goose Breeds

Authors: Cyril Hrncar, Jozef Bujko, Widya P. B. Putra

Abstract:

The growth curve of poultry is important for evaluating the farming management system. This study aimed to estimate the growth curve of body weight in geese. The growth curve was estimated with the non-linear Gompertz model using CurveExpert 1.4 software. Three Slovak mixed-sex goose breeds, Landes (L), Pomeranian (P) and Steinbacher (S), were used. A total of 28 geese (10 L, 8 P and 10 S) were used to estimate the growth curve. The asymptotic weights (A) were 5332.51 g (L), 6186.14 g (P) and 5048.27 g (S), while the maturing rate (k) was similar in each breed (0.05/day). The weight at inflection was 1960.48 g (L), 2274.32 g (P) and 1855.98 g (S), and the time of inflection (ti) was 25.6 days (L), 26.2 days (P) and 27.8 days (S). The maximum growth rate (MGR) was 98.02 g/day (L), 113.72 g/day (P) and 92.80 g/day (S). The coefficient of determination (R²) of the Gompertz model was 0.99 for each breed. It can be concluded that Pomeranian geese had higher growth traits than the other breeds.
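
A minimal sketch of such a Gompertz fit in Python, assuming the common parameterization W(t) = A·exp(-b·exp(-kt)); the synthetic weights below are tuned only to echo the reported Pomeranian parameters, and the derived quantities follow the standard formulas ti = ln(b)/k, weight at inflection A/e, and MGR = A·k/e.

```python
import numpy as np
from scipy.optimize import curve_fit

def gompertz(t, A, b, k):
    return A * np.exp(-b * np.exp(-k * t))

t = np.arange(0, 71, 7)              # days of age (weekly weighing, illustrative)
w = gompertz(t, 6186.14, 3.7, 0.05)  # noise-free stand-in body weights (g)

(A, b, k), _ = curve_fit(gompertz, t, w, p0=[6000, 3, 0.05])

ti = np.log(b) / k     # time of inflection (days)
wi = A / np.e          # weight at inflection (g)
mgr = A * k / np.e     # maximum growth rate (g/day)
print(f"A={A:.1f} g, k={k:.3f}/day, ti={ti:.1f} d, wi={wi:.1f} g, MGR={mgr:.1f} g/day")
```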

Keywords: body weight, growth curve, inflection, Slovak geese, Gompertz model

Procedia PDF Downloads 138
15974 The Ecological Role of Loligo forbesii in the Moray Firth Ecosystem, Northeast Scotland

Authors: Godwin A. Otogo, Sansanee Wangvoralak, Graham J. Pierce, Lee C. Hastie, Beth Scott

Abstract:

The squid Loligo forbesii is suspected to be an important species in marine food webs, as it can strongly impact its prey and be impacted upon by predation, competition, fishing and/or climate variability. To quantify these impacts in the food web, measuring its trophic position and ecological role within well-studied ecosystems is essential. An Ecopath model was balanced and run for the Moray Firth ecosystem and used to investigate the significance of this squid's trophic roles. The network analysis routine included in Ecopath with Ecosim (EwE) was used to estimate trophic interactions, system indicators (health condition and developmental stage) and food web features. Results indicated that within the Moray Firth squid occupy a top trophic position in the food web and are also a major prey item for many other species. The Omnivory Index (OI) showed that squid are generalized feeders transferring energy across wide trophic levels and are more important as predators than as prey in the Moray Firth ecosystem. The results highlight the importance of taking squid into account in the management of Europe's living marine resources.

Keywords: squid, Loligo forbesii, Ecopath, Moray Firth, trophic level

Procedia PDF Downloads 471
15973 Food Security Model and the Role of Community Empowerment: The Case of a Marginalized Village in Mexico, Tatoxcac, Puebla

Authors: Marco Antonio Lara De la Calleja, María Catalina Ovando Chico, Eduardo Lopez Ruiz

Abstract:

Community empowerment has proven to be a key element in solving the food security problem. A conceptual analysis found that agricultural production, economic development and governance are the traditional basis of food security models. Although the literature points to social inclusion as an important factor for food security, no model has taken it as its basis. The aim of this research is to identify the dimensions that make up an integral model of food security, with emphasis on community empowerment. A diagnosis was made in the study community (Tatoxcac, Zacapoaxtla, Puebla) to identify the aspects that affect its level of food insecurity. With a statistical sample of 200 families, the Latin American and Caribbean Food Security Scale (ELCSA) was applied, finding that households composed of adults and children experience moderate food insecurity (the ELCSA scale has three levels: low, moderate and high), a result driven mainly by income capacity and the diversity of the diet. A model was therefore developed to promote food security through five dimensions: 1. regional context of the community; 2. structure and system of local food; 3. health and nutrition; 4. information and technology access; and 5. self-awareness and empowerment. The specific actions on each axis of the model enable the systemic approach needed to address food security in the community through the empowerment of society. It is concluded that the self-awareness of local communities is of extreme importance and must be taken into account in participatory schemes to improve food security. In the long term, the model requires the integrated participation of different actors, such as government, companies and universities, to solve something as vital as food security.

Keywords: community empowerment, food security, model, systemic approach

Procedia PDF Downloads 367
15972 Speckle-Based Phase Contrast Micro-Computed Tomography with Neural Network Reconstruction

Authors: Y. Zheng, M. Busi, A. F. Pedersen, M. A. Beltran, C. Gundlach

Abstract:

X-ray phase contrast imaging has been shown to yield better contrast than conventional attenuation X-ray imaging, especially for soft tissues in the medical imaging energy range, which can potentially lead to better diagnoses for patients. However, phase contrast imaging has mainly been performed using highly brilliant synchrotron radiation, as it requires highly coherent X-rays. Many research teams have demonstrated that it is also feasible with a laboratory source, bringing it one step closer to clinical use. Nevertheless, the requirement for fine gratings and high-precision stepping motors when using a laboratory source prevents it from being widely used. Recently, a random phase object has been proposed as an analyzer; this method requires a much less demanding experimental setup. However, previous studies were done using a particular X-ray source (a liquid-metal jet micro-focus source) or high-precision motors for stepping. We have been working on a much simpler setup with just a small modification of a commercial bench-top micro-CT (computed tomography) scanner, introducing a piece of sandpaper as the phase analyzer in front of the X-ray source. This, however, needs suitable algorithms for speckle tracking and 3D reconstruction. The precision and sensitivity of the speckle tracking algorithm determine the resolution of the system, while the 3D reconstruction algorithm affects the minimum number of projections required, thus limiting the temporal resolution. As phase contrast imaging methods usually require much longer exposure times than traditional absorption-based X-ray imaging technologies, a dynamic phase contrast micro-CT with high temporal resolution is particularly challenging. Different reconstruction methods, including neural network based techniques, will be evaluated in this project to increase the temporal resolution of the phase contrast micro-CT. A Monte Carlo ray tracing simulation (McXtrace) was used to generate a large dataset to train the neural network, addressing the issue that neural networks require large amounts of training data to produce high-quality reconstructions.
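
A minimal sketch of windowed speckle tracking with scikit-image is given below: the local shift of the sandpaper speckle pattern between the reference and sample images is estimated by sub-pixel cross-correlation. The window size, upsampling factor and test images are illustrative.

```python
import numpy as np
from skimage.registration import phase_cross_correlation

def track_speckles(ref, sample, win=32, upsample=20):
    """Estimate the per-window (dy, dx) speckle displacement map."""
    shifts = []
    for i in range(0, ref.shape[0] - win, win):
        for j in range(0, ref.shape[1] - win, win):
            shift, _, _ = phase_cross_correlation(
                ref[i:i + win, j:j + win],
                sample[i:i + win, j:j + win],
                upsample_factor=upsample)      # sub-pixel precision
            shifts.append(shift)
    return np.array(shifts)

rng = np.random.default_rng(3)
ref = rng.random((256, 256))                   # stand-in speckle reference image
sample = np.roll(ref, (1, 2), axis=(0, 1))     # known rigid shift as a sanity test
# Recovers the (1, 2) pixel shift (up to the library's sign convention).
print(track_speckles(ref, sample).mean(axis=0))
```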

Keywords: micro-ct, neural networks, reconstruction, speckle-based x-ray phase contrast

Procedia PDF Downloads 254
15971 A Comprehensive Metamodel of an Urbanized Information System: Experimental Case

Authors: Leila Trabelsi

Abstract:

The urbanization of Information Systems (IS) is an effective approach to mastering the complexity of an organization: it strengthens the coherence of the IS and aligns it with the business strategy. Moreover, this approach has significant advantages, such as reducing Information Technology (IT) costs, enhancing the position of the IS in a competitive environment, and ensuring the scalability of the IS through the integration of technological innovations. Urbanization is therefore considered a strategic business decision, and embedding it becomes necessary in order to improve IS practice. However, there is a lack of experimental cases studying the meta-modelling of Urbanized Information Systems (UIS). This paper proposes a new urbanization content meta-model that permits modelling and testing while taking organizational aspects into consideration. The methodological framework is structured according to two main abstraction levels, a conceptual level and an operational level, and for each of these levels different models are proposed and presented. The proposed meta-model has been empirically tested on a company. The findings present an experimental study of the urbanization meta-model and point out the significant relationships between its dimensions and their evolution.

Keywords: urbanization, information systems, enterprise architecture, meta-model

Procedia PDF Downloads 433
15970 Exploring the Factors Affecting the Intention of Using Mobile Phone E-Book by TAM and IDT

Authors: Yen-Ku Kuo, Chie-Bein Chen, Jyh-Yi Shih, Kuang-Yi Lin, Chien-Han Peng

Abstract:

This study is primarily concerned with exploring the factors that affect consumers' intention to use mobile phone e-books. In developing the research framework, we adopted the Technology Acceptance Model (TAM) and Innovation Diffusion Theory (IDT) as foundations. Structural equation modelling (SEM) was used as the analysis method. Subjects were 261 users who use or have used mobile phone e-books. The findings can be summarized as follows: (1) Subjective norm and job relevance have a non-significant, positive influence on perceived usefulness; this reflects that users are still few in number and that most of them use e-books for non-work-related purposes. (2) Output quality, result demonstrability and perceived ease of use were confirmed to have a positive and significant influence on perceived usefulness. (3) The moderator 'innovation diffusion' affects the relationship between attitude and behavioral intention. These findings can serve as a reference for practice and for further exploration in future studies.

Keywords: mobile phone e-book, technology acceptance model (TAM), innovation diffusion theory (IDT), structural equation model (SEM)

Procedia PDF Downloads 502
15969 On the Survival of Individuals with Type 2 Diabetes Mellitus in the United Kingdom: A Retrospective Case-Control Study

Authors: Njabulo Ncube, Elena Kulinskaya, Nicholas Steel, Dmitry Pshezhetskiy

Abstract:

Life expectancy in the United Kingdom (UK) has been nearly constant since 2010, particularly for individuals of 65 years and older, a trend that has also been noted in several other countries. This slowdown in the increase of life expectancy was concurrent with the increase in the number of deaths caused by non-communicable diseases. Of particular concern is the worldwide exponential increase in the number of diabetes-related deaths. Previous studies have reported increased mortality hazards among diabetics compared to non-diabetics, and differing effects of antidiabetic drugs on mortality hazards. This study aimed to estimate the all-cause mortality hazards and related life expectancies among type 2 diabetes (T2DM) patients in the UK using a time-variant Gompertz-Cox model with frailty. The study also aimed to understand the major causes of the change in life expectancy growth in the last decade. A total of 221,182 individuals (30.8% T2DM, 57.6% male) aged 50 years and above, born between 1930 and 1960 inclusive and diagnosed between 2000 and 2016, were selected from The Health Improvement Network (THIN) database of UK primary care data and followed up to 31 December 2016. About 13.4% of participants died during the follow-up period. The overall all-cause mortality hazard ratios of T2DM patients compared to non-diabetic controls were 1.467 (1.381-1.558) and 1.38 (1.307-1.457) for diagnosis at 50 to 59 years and at 60 to 74 years, respectively. The estimated life expectancies of T2DM individuals without further comorbidities diagnosed at the age of 60 years were 2.43 (1930-1939 birth cohort), 2.53 (1940-1949 birth cohort) and 3.28 (1950-1960 birth cohort) years less than those of non-diabetic controls. However, the 1950-1960 birth cohort had a steeper hazard function than the 1940-1949 birth cohort for both T2DM and non-diabetic individuals. In conclusion, mortality hazards for people with T2DM continue to be higher than for non-diabetics, and the steeper mortality hazard slope for the 1950-1960 birth cohort might indicate a sub-population contributing to the slowdown in the growth of life expectancy.
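
As a simplified stand-in for the Gompertz-Cox frailty model (which has no off-the-shelf implementation in common Python packages), the sketch below fits a standard Cox proportional hazards model with lifelines on synthetic records, only to illustrate the hazard-ratio workflow.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# Synthetic cohort: T2DM status and birth-cohort indicator as covariates.
rng = np.random.default_rng(4)
n = 2000
df = pd.DataFrame({
    "t2dm": rng.integers(0, 2, n),                 # 1 = diagnosed T2DM
    "birth_cohort_1950s": rng.integers(0, 2, n),
    "followup_years": rng.exponential(10, n),
})
# Event probability raised roughly 1.4x for T2DM, mimicking the reported ratio.
risk = 0.05 * np.exp(0.35 * df["t2dm"])
df["event"] = (rng.random(n) < risk * df["followup_years"] / 10).astype(int)

cph = CoxPHFitter()
cph.fit(df, duration_col="followup_years", event_col="event")
cph.print_summary()   # hazard ratios with confidence intervals
```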

Keywords: T2DM, Gompetz-Cox model with frailty, all-cause mortality, life expectancy

Procedia PDF Downloads 118
15968 Stability Analysis and Experimental Evaluation on Maxwell Model of Impedance Control

Authors: Le Fu, Rui Wu, Gang Feng Liu, Jie Zhao

Abstract:

Impedance control methods are normally based on a model that connects a spring and damper in parallel. The series connection, namely the Maxwell model, has emerged as a counterpart and drawn the attention of robotics researchers. Theoretical analysis shows that the two patterns are equivalent to some extent, but notable differences in response characteristics exist, especially in the effect of damping viscosity. However, this novel impedance control design lacks validation on realistic robot platforms. In this study, stability analysis and experimental evaluation are carried out using a 3-fingered Barrett® robotic hand BH8-282 endowed with tactile sensing, mounted on a torque-controlled, lightweight, collaborative KUKA® LBR iiwa 14 R820 robot. Object handover and incoming-object catching tasks are executed for validation and analysis. Experimental results show that the series connection pattern performs much better at natural impact or shock absorption, which indicates promising applications in robots' safe physical interaction with humans and objects in various environments.
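
The difference in shock absorption between the two patterns can be seen in a toy simulation: under the same impact-like ramp deformation, the parallel (Kelvin-Voigt) force spikes with the impact velocity, while the series (Maxwell) force, governed by dF/dt = k(xdot - F/c), builds up gradually. The stiffness, viscosity and deformation profile below are illustrative.

```python
import numpy as np
from scipy.integrate import solve_ivp

k, c = 500.0, 20.0                 # stiffness (N/m) and viscosity (N*s/m)

def x(t):                          # imposed deformation: 0.02 m ramp over 10 ms
    return np.minimum(t / 0.01, 1.0) * 0.02

def xdot(t):
    return np.where(t < 0.01, 2.0, 0.0)

t_eval = np.linspace(0, 0.1, 500)

# Kelvin-Voigt (parallel): F = k*x + c*xdot -> force spikes with impact velocity.
F_voigt = k * x(t_eval) + c * xdot(t_eval)

# Maxwell (series): spring and damper carry the same force, deformations add,
# so x = F/k + integral(F/c), i.e. dF/dt = k*(xdot - F/c).
def dFdt(t, F):
    return [k * (float(xdot(t)) - F[0] / c)]

sol = solve_ivp(dFdt, (0, 0.1), [0.0], t_eval=t_eval, max_step=1e-3)
F_maxwell = sol.y[0]

print(f"peak force: Voigt {F_voigt.max():.1f} N, Maxwell {F_maxwell.max():.1f} N")
```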

Keywords: impedance control, Maxwell model, force control, dexterous manipulation

Procedia PDF Downloads 495
15967 The Study of Chitosan beads Adsorption Properties for the Removal of Heavy Metals

Authors: Peter O. Osifo, Hein W. J. P. Neomagus

Abstract:

In this study, a predictive pH model was used to determine the adsorption equilibrium properties of copper, lead, zinc and cadmium. Chitosan was prepared from the exoskeletons of Cape rock lobsters collected from the surroundings of Cape Town, South Africa. The beads were cross-linked with glutaraldehyde to preserve their chemical stability in acidic media. The chitosan beads were characterized: their water content and pKa varied in the ranges of 90-96% and 4.3-6.0, respectively, and their degree of cross-linking was 18%. A pH model, which describes the reversibility of metal adsorption onto the beads, was used to predict the equilibrium properties of copper, lead, zinc and cadmium adsorption onto the cross-linked beads. The model accounts for the effect of pH and for the important model parameters: the equilibrium adsorption constant (Kads) and, to a lesser extent, the adsorbent adsorption capacity (qmax). The adsorption equilibrium constants for copper, lead, zinc and cadmium were found to be 2.58×10⁻³, 2.22×10⁻³, 9.55×10⁻³ and 4.79×10⁻³, respectively. The maximum capacity of the adsorbent was determined to be 4.2 mmol/g.
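
Assuming the constants enter in the standard Langmuir form q = qmax·K·C/(1 + K·C) (the abstract does not state the exact isotherm), the reported parameters can be evaluated as follows; the concentration grid and its units are illustrative.

```python
import numpy as np

q_max = 4.2                                  # mmol/g, reported maximum capacity
K_ads = {"Cu": 2.58e-3, "Pb": 2.22e-3,
         "Zn": 9.55e-3, "Cd": 4.79e-3}       # reported equilibrium constants

C = np.array([10.0, 50.0, 100.0])            # equilibrium concentrations (assumed units)
for metal, K in K_ads.items():
    q = q_max * K * C / (1 + K * C)          # uptake per gram of beads (mmol/g)
    print(metal, np.round(q, 3))
```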

Keywords: chitosan beads, adsorption, heavy metals, waste water

Procedia PDF Downloads 373
15966 Investigation Bubble Growth and Nucleation Rates during the Pool Boiling Heat Transfer of Distilled Water Using Population Balance Model

Authors: V. Nikkhah Rashidabad, M. Manteghian, M. Masoumi, S. Mousavian

Abstract:

This research investigates the changes in bubble diameter and number that occur due to changes in the heat flux applied to pure water during the pool boiling process. For this purpose, test equipment was designed and developed to collect the data, and the bubbles were graded using Caliper Screen software. To calculate the growth and nucleation rates of the bubbles under different heat fluxes, a population balance model was employed. The results show that increasing the heat flux from q = 20 kW/m² to q = 102 kW/m² raised the growth and nucleation rates of the bubbles.

Keywords: heat flux, bubble growth, bubble nucleation, population balance model

Procedia PDF Downloads 471
15965 Development of Open Source Geospatial Certification Model Based on Geospatial Technology Competency Model

Authors: Tanzeel Ur Rehman Khan, Franz Josef Behr, Phillip Davis

Abstract:

Open source geospatial certifications are needed in the geospatial technology education and industry sectors. In parallel with proprietary software, free and open source software solutions have become important in geospatial technology research and play an important role in the growth of the geospatial industry. ESRI, GISCI (GIS Certification Institute), ASPRS (American Society for Photogrammetry and Remote Sensing), and Meta spatial offer certifications on proprietary and open source software. These are portfolio- and competency-based certifications grounded in the GIS Body of Knowledge (BoK). The analysis of these certification approaches might reveal gaps in them and open a new way to develop certifications for open source (OS) geospatial software. Such a new certification would investigate the different geospatial competencies in terms of open source tools, helping to identify geospatial professionals and strengthen geospatial academic content. The goal of this research is to introduce a geospatial certification model based on the Geospatial Technology Competency Model (GTCM). The developed certification will not only underline the importance of geospatial education and the production of a competency-based geospatial workforce in universities and companies (private or public) but also describe open source solutions, tools and technology. Job analysis, market analysis and survey analysis for this certification open a new horizon for business as well.

Keywords: geospatial certification, open source, geospatial technology competency model, geoscience

Procedia PDF Downloads 549
15964 Probability-Based Damage Detection of Structures Using Kriging Surrogates and Enhanced Ideal Gas Molecular Movement Algorithm

Authors: M. R. Ghasemi, R. Ghiasi, H. Varaee

Abstract:

Surrogate models have received increasing attention for use in detecting damage in structures based on vibration modal parameters. However, uncertainties existing in the measured vibration data may lead to false or unreliable output from such models. In this study, an efficient approach based on Monte Carlo simulation is proposed to take the effect of uncertainties into account when developing a surrogate model. The probability of damage existence (PDE) is calculated based on the probability density functions of the undamaged and damaged states. The kriging technique allows one to genuinely quantify the surrogate error and is therefore chosen as the metamodeling technique. An enhanced version of the ideal gas molecular movement (EIGMM) algorithm is used as the main algorithm for model updating. The developed approach is applied to detect simulated damage in numerical models of a 72-bar space truss and a 120-bar dome truss. The simulation results show that the proposed method performs well in the probability-based damage detection of structures, with less computational effort than a direct finite element model.
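
A toy sketch of the kriging-plus-Monte-Carlo step is shown below using scikit-learn's Gaussian process regressor as the kriging surrogate; the damage-to-response function, noise level and damage threshold are all illustrative.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Train a kriging surrogate mapping a damage severity parameter to a modal feature.
rng = np.random.default_rng(5)
X_train = rng.uniform(0, 1, size=(40, 1))                         # damage severity
y_train = np.sin(3 * X_train[:, 0]) + 0.01 * rng.normal(size=40)  # modal response

gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2), alpha=1e-4)
gp.fit(X_train, y_train)

# Monte Carlo over measurement uncertainty: each noisy measurement is inverted
# through the surrogate; PDE = fraction of samples exceeding a damage threshold.
measured = np.sin(3 * 0.6) + 0.05 * rng.normal(size=5000)
grid = np.linspace(0, 1, 400).reshape(-1, 1)
pred = gp.predict(grid)
severity = grid[np.abs(pred[None, :] - measured[:, None]).argmin(axis=1), 0]
pde = np.mean(severity > 0.3)
print(f"probability of damage existence ~ {pde:.2f}")
```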

Keywords: probability-based damage detection (PBDD), Kriging, surrogate modeling, uncertainty quantification, artificial intelligence, enhanced ideal gas molecular movement (EIGMM)

Procedia PDF Downloads 233
15963 Aerodynamic Devices Development for Model Aircraft Control and Wind-Driven Bicycle

Authors: Yuta Moriyama, Tsuyoshi Yamazaki, Etsuo Morishita

Abstract:

Several aerodynamic devices currently attract engineers and research students. The plasma actuator is one of them, and it is very effective at controlling flow: the actuator can recover a separated flow to an attached one, and it can also be applied inversely, as a spoiler. A model aircraft might thus be controlled by this actuator, and we develop a model aircraft equipped with one. Another interesting device is the Wells turbine, which rotates in one direction regardless of the flow direction. The present authors propose a bicycle with Wells turbines in the wheels. The power reduction is measured when the turbine is driven by an electric motor at the exit of a wind tunnel; a power reduction of several watts might be possible. This means that the torque of the bike can be augmented by the turbine in a crosswind. These devices are tested in the wind tunnel with a three-component balance, and the aerodynamic forces and moments are obtained. In this paper, we introduce these devices and their aerodynamic characteristics: the control force and moment of the plasma actuator are clarified, and the power reduction of the bicycle is quantified.

Keywords: aerodynamics, model aircraft, plasma actuator, Wells turbine

Procedia PDF Downloads 238
15962 The SBO/LOCA Analysis of TRACE/SNAP for Kuosheng Nuclear Power Plant

Authors: J. R. Wang, H. T. Lin, Y. Chiang, H. C. Chen, C. Shih

Abstract:

Kuosheng Nuclear Power Plant (NPP) is located on the northern coast of Taiwan. Its nuclear steam supply system is a BWR/6 type designed and built by General Electric on a twin-unit concept. In this research, the methodology of the Kuosheng NPP SPU (Stretch Power Uprate) safety analysis TRACE/SNAP model was first developed. Then, in order to assess the safety of the Kuosheng NPP under more severe conditions, an SBO (Station Blackout) + LOCA (Loss-of-Coolant Accident) transient analysis of the Kuosheng NPP SPU TRACE/SNAP model was performed. In addition, an animation model of the Kuosheng NPP was presented using the animation function of SNAP with the TRACE/SNAP analysis results.

Keywords: TRACE, safety analysis, BWR/6, severe accident

Procedia PDF Downloads 704
15961 Edmonton Urban Growth Model as a Support Tool for the City Plan Growth Scenarios Development

Authors: Sinisa J. Vukicevic

Abstract:

Edmonton is currently one of the youngest North American cities and has achieved significant growth over the past 40 years. This strong urban shift requires a new approach to how the city is envisioned, planned, and built. That approach is evidence-based scenario development, and an urban growth model was a key support tool in framing Edmonton's development strategies, developing urban policies, and assessing policy implications. The urban growth model was developed on the Metronamica software platform. The Metronamica land use model evaluated the dynamics of land use change under the influence of key development drivers (population and employment), zoning, land suitability, and land and activity accessibility. The model was designed following the Big City Moves ideas: become greener as we grow, develop a rebuildable city, ignite a community of communities, foster a healing city, and create a city of convergence. The Big City Moves were converted into three development scenarios: 'Strong Central City', 'Node City', and 'Corridor City'. Each scenario has a narrative story that expresses its high-level goal, its approach to residential and commercial activities, its transportation vision, and its employment and environmental principles. Land use demand was calculated for each scenario according to specific density targets. Spatial policies were analyzed according to their level of importance within the policy set definition for the specific scenario, and also through the policy measures. The model was calibrated to reproduce the known historical land use pattern, using 2006 and 2011 land use data. The validation was done independently, on data not used for calibration: the model was validated with 2016 data. In general, the modeling process contains three main phases: 'from qualitative storyline to quantitative modelling', 'model development and model run', and 'from quantitative modelling to qualitative storyline'. The model also incorporates five spatial indicators: distance from residential to work, distance from residential to recreation, distance to the river valley, urban expansion, and habitat fragmentation. The major findings of this research can be viewed from two perspectives: the planning perspective and the technology perspective. The planning perspective evaluates the model as a tool for scenario development. Using the model, we explored the land use dynamics influenced by different sets of policies. The model enables a direct comparison between the three scenarios: we explored their similarities and differences through their quantitative indicators, including land use change, population change (and spatial allocation), job allocation, density (population, employment, and dwelling units), habitat connectivity, and proximity to objects of interest. From the technology perspective, the model showed one very important characteristic: flexibility. The direction of policy testing changed many times during the consultation process, and the model's flexibility in accommodating all these changes was highly appreciated. The model satisfied our needs as a scenario development and evaluation tool, and also as a communication tool during the consultation process.

Keywords: urban growth model, scenario development, spatial indicators, Metronamica

Procedia PDF Downloads 91
15960 Developing a Knowledge-Based Lean Six Sigma Model to Improve Healthcare Leadership Performance

Authors: Yousuf N. Al Khamisi, Eduardo M. Hernandez, Khurshid M. Khan

Abstract:

Purpose: This paper presents a Knowledge-Based (KB) model using Lean Six Sigma (L6σ) principles to enhance the performance of healthcare leadership. Design/methodology/approach: Using L6σ principles to enhance healthcare leaders' performance requires a pre-assessment of the healthcare organisation's capabilities. The model is developed using a rule-based KB system approach; the KB system embeds Gauging Absence of Pre-requisites (GAP) for benchmarking and the Analytical Hierarchy Process (AHP) for prioritization. A comprehensive literature review covers the main contents of the model, with typical outputs of the GAP analysis and AHP. Findings: The proposed KB system benchmarks the current position of healthcare leadership against an ideal benchmark (resulting from extensive evaluation by the KB/GAP/AHP system of international leadership concepts in healthcare environments). Research limitations/implications: Future work includes validating the implementation model in healthcare environments around the world. Originality/value: This paper presents a novel application of a hybrid KB system combining the GAP and AHP methodologies. It implements L6σ principles to enhance healthcare performance, and the approach assists healthcare leaders' decision making in reaching performance improvement against a best-practice benchmark.
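
The AHP prioritization step can be sketched in a few lines of NumPy: criterion weights are the normalized principal eigenvector of a pairwise comparison matrix, checked by a consistency ratio. The 3×3 matrix below (on Saaty's 1-9 scale) is illustrative, not taken from the paper's KB system.

```python
import numpy as np

# Illustrative pairwise comparison matrix for three leadership criteria.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                  # normalized priority vector

# Consistency check: CI = (lambda_max - n)/(n - 1), CR = CI/RI (RI = 0.58 for n = 3).
n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)
cr = ci / 0.58
print(f"weights = {np.round(weights, 3)}, CR = {cr:.3f}")  # CR < 0.1 is acceptable
```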

Keywords: Lean Six Sigma (L6σ), Knowledge-Based System (KBS), healthcare leadership, Gauge Absence Prerequisites (GAP), Analytical Hierarchy Process (AHP)

Procedia PDF Downloads 163
15959 Currency Exchange Rate Forecasts Using Quantile Regression

Authors: Yuzhi Cai

Abstract:

In this paper, we discuss a Bayesian approach to quantile autoregressive (QAR) time series model estimation and forecasting. Together with a forecast-combining technique, we then predict USD-to-GBP currency exchange rates. Combined forecasts contain all the information captured by the fitted QAR models at different quantile levels and are therefore better than those obtained from individual models. Our results show that an unequally weighted combining method performs better than the other forecasting methodologies. We found that a median AR model can perform well in point forecasting when the predictive density functions are symmetric. However, in practice, using the median AR model alone may involve a loss of the information about the data captured by the other QAR models. We recommend that combined forecasts be used whenever possible.
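
A minimal (non-Bayesian) QAR(1) sketch with statsmodels is given below: the exchange rate is regressed on its own lag at several quantile levels, and the one-step-ahead quantile forecasts are combined, here with equal weights; the series is synthetic, and the paper's Bayesian estimation and unequal weighting are not reproduced.

```python
import numpy as np
import statsmodels.api as sm

# Synthetic random-walk stand-in for the USD/GBP exchange rate series.
rng = np.random.default_rng(6)
rate = 1.3 + np.cumsum(0.005 * rng.normal(size=500))

y, y_lag = rate[1:], rate[:-1]
X = sm.add_constant(y_lag)

quantiles = [0.1, 0.25, 0.5, 0.75, 0.9]
forecasts = []
for q in quantiles:
    res = sm.QuantReg(y, X).fit(q=q)                        # QAR(1) at level q
    forecasts.append(res.params[0] + res.params[1] * rate[-1])  # one step ahead

print(f"combined forecast = {np.mean(forecasts):.4f}")      # equal-weight combination
```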

Keywords: combining forecasts, MCMC, predictive density functions, quantile forecasting, quantile modelling

Procedia PDF Downloads 252
15958 New Moment Rotation Model of Single Web Angle Connections

Authors: Zhengyi Kong, Seung-Eock Kim

Abstract:

Single web angle connections, which are bolted to the beam web and the column flange, are studied to investigate their moment-rotation behavior. Elastic-perfectly plastic material behavior is assumed. ABAQUS software is used to analyze the nonlinear behavior of a single angle connection. The same geometric and material conditions as in Yanglin Gong's tests are used to verify the finite element models. Since Kishi and Chen's power model and Lee and Moon's log model are accurate only over a limited range, simpler and more accurate hyperbolic function models are proposed. An equation for calculating the rotation at the ultimate moment is proposed for the first time.
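
Since the abstract does not give the proposed equation, the sketch below only illustrates fitting a generic two-parameter hyperbolic moment-rotation form, M(θ) = Mu·tanh(θ/θ0), to synthetic finite element data.

```python
import numpy as np
from scipy.optimize import curve_fit

def hyperbolic(theta, m_u, theta0):
    """Generic hyperbolic moment-rotation curve (illustrative form)."""
    return m_u * np.tanh(theta / theta0)

# Synthetic stand-in for FE moment-rotation results (rotation in rad, moment in kN*m).
theta = np.linspace(0, 0.06, 20)
moment = hyperbolic(theta, 25.0, 0.015) + 0.2 * np.random.default_rng(7).normal(size=20)

(m_u, theta0), _ = curve_fit(hyperbolic, theta, moment, p0=[20.0, 0.01])
print(f"Mu = {m_u:.1f} kN*m, theta0 = {theta0:.4f} rad")
```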

Keywords: finite element method, moment and rotation, rotation at ultimate moment, single-web angle connections

Procedia PDF Downloads 420
15957 Detecting Venomous Files in IDS Using an Approach Based on Data Mining Algorithm

Authors: Sukhleen Kaur

Abstract:

Intrusion Detection Systems (IDS) have become an important component of security groundwork and have received increasing attention in recent years. An IDS is an effective way to detect different kinds of attacks and malicious code in a network and helps to secure it. Data mining techniques can be applied to IDS to analyze large amounts of data and give better results; data mining can contribute to improving intrusion detection by adding a level of focus to anomaly detection. While most work so far has focused on detecting attacks, this paper detects malicious files: some intruders do not attack directly but hide harmful code inside files, or corrupt those files, and so attack the system. These files are detected according to defined parameters, which form two lists of files, normal and harmful, after which data mining is performed. A hybrid classifier combining the Naive Bayes and RIPPER classification methods is used. The results show how an uploaded file in the database is tested against the parameters, characterised as either a normal or a harmful file, and then mined. Moreover, when a user tries to mine a harmful file, an exception is generated stating that mining cannot be performed on corrupted or harmful files.
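
A minimal sketch of the Naive Bayes half of the hybrid classifier is shown below; the file-description features are hypothetical, and the RIPPER rule learner is omitted because it has no scikit-learn implementation.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import train_test_split

# Hypothetical numeric file parameters and normal (0) / harmful (1) labels.
rng = np.random.default_rng(8)
n = 400
X = np.column_stack([
    rng.normal(50, 20, n),        # e.g., file size (KB)
    rng.integers(0, 2, n),        # e.g., contains-executable-code flag
    rng.normal(5, 2, n),          # e.g., entropy of file contents
])
y = ((X[:, 1] == 1) & (X[:, 2] > 5)).astype(int)   # toy labelling rule

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = GaussianNB().fit(X_tr, y_tr)
print(f"accuracy = {clf.score(X_te, y_te):.3f}")

# Files flagged harmful are excluded from mining via an exception, as described.
def mine(file_features):
    if clf.predict([file_features])[0] == 1:
        raise ValueError("mining cannot be performed on corrupted or harmful files")
    return "mining started"
```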

Keywords: data mining, association, classification, clustering, decision tree, intrusion detection system, misuse detection, anomaly detection, naive Bayes, ripper

Procedia PDF Downloads 409