Search results for: national models
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 10879

10189 E-Consumers’ Attribute Non-Attendance Switching Behavior: Effect of Providing Information on Attributes

Authors: Leonard Maaya, Michel Meulders, Martina Vandebroek

Abstract:

Discrete Choice Experiments (DCE) are used to investigate how product attributes affect decision-makers’ choices. In DCEs, choice situations consisting of several alternatives are presented from which choice-makers select the preferred alternative. Standard multinomial logit models based on random utility theory can be used to estimate the utilities for the attributes. The overarching principle in these models is that respondents understand and use all the attributes when making choices. However, studies suggest that respondents sometimes ignore some attributes (commonly referred to as Attribute Non-Attendance/ANA). The choice modeling literature presents ANA as a static process, i.e., respondents’ ANA behavior does not change throughout the experiment. However, respondents may ignore attributes due to changing factors like availability of information on attributes, learning/fatigue in experiments, etc. We develop a dynamic mixture latent Markov model to model changes in ANA when information on attributes is provided. The model is illustrated on e-consumers’ webshop choices. The results indicate that the dynamic ANA model describes the behavioral changes better than modeling the impact of information using changes in parameters. Further, we find that providing information on attributes leads to an increase in the attendance probabilities for the investigated attributes.
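As a rough illustration of the modelling idea described above (not the authors' dynamic latent Markov model), the following sketch computes multinomial logit choice probabilities and mixes a full-attendance class with a class that ignores one attribute, which is the usual way ANA is represented by zeroing the corresponding coefficient. All attribute values, weights and the attendance probability are hypothetical.

```python
# Two-class mixture multinomial logit with attribute non-attendance (ANA).
# Simplified, hypothetical illustration only.
import numpy as np

def mnl_probs(X, beta):
    """Multinomial logit choice probabilities for one choice set.
    X: (n_alternatives, n_attributes) attribute matrix, beta: utility weights."""
    v = X @ beta                     # deterministic utilities
    ev = np.exp(v - v.max())         # numerically stable softmax
    return ev / ev.sum()

# Hypothetical choice set: 3 webshop alternatives described by 2 attributes.
X = np.array([[1.0, 0.2],
              [0.5, 0.9],
              [0.0, 0.4]])
beta = np.array([1.2, 0.8])          # full-attendance utility weights

# Class 1 attends to both attributes; class 2 ignores attribute 2 (ANA),
# modelled by setting its coefficient to zero.
beta_ana = beta.copy()
beta_ana[1] = 0.0
pi_attend = 0.7                      # hypothetical attendance probability

p_mix = pi_attend * mnl_probs(X, beta) + (1 - pi_attend) * mnl_probs(X, beta_ana)
print(p_mix)
```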

Keywords: choice models, discrete choice experiments, dynamic models, e-commerce, statistical modeling

Procedia PDF Downloads 138
10188 Mathematical Models for Drug Diffusion Through the Compartments of Blood and Tissue Medium

Authors: M. A. Khanday, Aasma Rafiq, Khalid Nazir

Abstract:

This paper is an attempt to establish mathematical models to understand the distribution of administered drugs in the human body through oral and intravenous routes. Three models were formulated based on the diffusion process using Fick’s principle and the law of mass action. The rate constants governing the law of mass action were used on the basis of the drug efficacy at the different interfaces. The Laplace transform and eigenvalue methods were used to obtain the solution of the ordinary differential equations describing the rate of change of concentration in the different compartments, viz. the blood and tissue medium. The drug concentration in the different compartments was computed using numerical parameters, and the variation of drug concentration with respect to time was illustrated using MATLAB software. It was observed from the results that the drug concentration decreases in the first compartment and gradually increases in the subsequent compartments.
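For orientation, the sketch below integrates a minimal two-compartment (blood to tissue) first-order transfer model of the kind described above. The paper solves such systems analytically (Laplace transform and eigenvalue methods); here the ODEs are simply integrated numerically with hypothetical rate constants.

```python
# Minimal two-compartment drug transfer model; rate constants are hypothetical.
import numpy as np
from scipy.integrate import solve_ivp

k12, k21, ke = 0.8, 0.3, 0.2        # hypothetical rate constants (1/h)

def rhs(t, c):
    c_blood, c_tissue = c
    dcb = -k12 * c_blood + k21 * c_tissue - ke * c_blood   # blood compartment
    dct = k12 * c_blood - k21 * c_tissue                    # tissue compartment
    return [dcb, dct]

sol = solve_ivp(rhs, (0.0, 24.0), [1.0, 0.0], dense_output=True)
t = np.linspace(0, 24, 5)
print(sol.sol(t))   # blood concentration falls; tissue concentration rises, then decays
```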

Keywords: Laplace transform, diffusion, eigenvalue method, mathematical model

Procedia PDF Downloads 332
10187 Analysis on South Korean Early Childhood Education Teachers’ Stage of Concerns about Software Education According to the Concern-Based Adoption Model

Authors: Sun-Mi Park, Ji-Hyun Jung, Min-Jung Kang

Abstract:

Software (SW) education is scheduled to be included in the national curriculum in South Korea by 2018. However, the Korean national kindergarten curriculum has been excluded from the revision of the overall national school curriculum that introduces software education. Even though SW education is not yet part of the current national kindergarten curriculum, there is growing interest in adopting software education into early childhood education (ECE) practice. Teachers are likely to be a key element in introducing and implementing an educational change such as SW education. In preparation for the adoption of SW education in ECE, it is therefore necessary to examine ECE teachers’ perceptions of and attitudes toward early childhood software education. For this study, the concern levels of 219 ECE teachers regarding SW education were surveyed using the Stages of Concern Questionnaire (SoCQ). The results show that the teachers’ concern level is highest at stage 0 (unconcerned) and also high at stage 1 (informational), stage 2 (personal), and stage 3 (management), so that a non-user pattern was mostly indicated. However, compared to a typical non-user pattern, the informational and personal concern levels were slightly elevated, and a ‘tailing up’ toward stage 6 (refocusing) was observed; the pattern therefore appeared, to some extent, close to that of a critical non-user. In addition, a significant difference in concern level was found at all stages depending on the perceived necessity of SW education. Teachers with SW training experience showed higher intensity only at stage 0, and there were statistically significant differences at stages 0 and 6 depending on the decision about future implementation. These results can be utilized as a resource in building a support system for ECE teachers according to their concern level about SW education.

Keywords: concerns-based adoption model (CBAM), early childhood education teachers, software education, Stages of Concern (SoC)

Procedia PDF Downloads 205
10186 Deep Learning Approach for Chronic Kidney Disease Complications

Authors: Mario Isaza-Ruget, Claudia C. Colmenares-Mejia, Nancy Yomayusa, Camilo A. González, Andres Cely, Jossie Murcia

Abstract:

Quantification of the risks associated with the development of complications from chronic kidney disease (CKD) through accurate survival models can help with patient management. A retrospective cohort study was carried out that included patients diagnosed with CKD in a primary care program and followed up between 2013 and 2018. Time-dependent and static covariates associated with demographic, clinical, and laboratory factors were included. Deep Learning (DL) survival analyses were developed for three CKD outcomes: CKD stage progression, a >25% decrease in estimated glomerular filtration rate (eGFR), and renal replacement therapy (RRT). Models were evaluated and compared with Random Survival Forest (RSF) based on the concordance index (C-index) metric. In total, 2,143 patients were included. Two models were developed for each outcome. The Deep Neural Network (DNN) model reported a C-index of 0.9867 for CKD stage progression, 0.9905 for the reduction in eGFR, and 0.9867 for RRT. The RSF model reached a C-index of 0.6650 for CKD stage progression, 0.6759 for decreased eGFR, and 0.8926 for RRT. DNN models applied in a survival analysis context, with consideration of longitudinal covariates at the start of follow-up, can predict renal stage progression, a significant decrease in eGFR, and RRT. The success of these survival models lies in the appropriate definition of survival times and the analysis of covariates, especially those that vary over time.
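Since both models above are compared on the concordance index, the following sketch shows how a C-index can be computed for right-censored survival data: a pair of subjects is comparable when the one with the shorter time experienced the event, and concordant when the model assigns that subject the higher risk. The data below are hypothetical; this is not the authors' DNN or RSF implementation.

```python
# Concordance index (C-index) for right-censored survival data, hypothetical data.
import numpy as np

def c_index(time, event, risk):
    concordant, comparable = 0.0, 0
    n = len(time)
    for i in range(n):
        for j in range(n):
            if time[i] < time[j] and event[i] == 1:      # comparable pair
                comparable += 1
                if risk[i] > risk[j]:
                    concordant += 1
                elif risk[i] == risk[j]:
                    concordant += 0.5
    return concordant / comparable

time = np.array([5.0, 8.0, 3.0, 12.0])    # follow-up times (years)
event = np.array([1, 0, 1, 1])            # 1 = outcome observed, 0 = censored
risk = np.array([0.9, 0.3, 0.95, 0.1])    # model-predicted risk scores
print(c_index(time, event, risk))
```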

Keywords: artificial intelligence, chronic kidney disease, deep neural networks, survival analysis

Procedia PDF Downloads 133
10185 Modelling Conceptual Quantities Using Support Vector Machines

Authors: Ka C. Lam, Oluwafunmibi S. Idowu

Abstract:

Uncertainty in cost is a major factor affecting performance of construction projects. To our knowledge, several conceptual cost models have been developed with varying degrees of accuracy. Incorporating conceptual quantities into conceptual cost models could improve the accuracy of early predesign cost estimates. Hence, the development of quantity models for estimating conceptual quantities of framed reinforced concrete structures using supervised machine learning is the aim of the current research. Using measured quantities of structural elements and design variables such as live loads and soil bearing pressures, response and predictor variables were defined and used for constructing conceptual quantities models. Twenty-four models were developed for comparison using a combination of non-parametric support vector regression, linear regression, and bootstrap resampling techniques. R programming language was used for data analysis and model implementation. Gross soil bearing pressure and gross floor loading were discovered to have a major influence on the quantities of concrete and reinforcement used for foundations. Building footprint and gross floor loading had a similar influence on beams and slabs. Future research could explore the modelling of other conceptual quantities for walls, finishes, and services using machine learning techniques. Estimation of conceptual quantities would assist construction planners in early resource planning and enable detailed performance evaluation of early cost predictions.
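As a hedged sketch of the modelling strategy described above, the snippet below combines support vector regression with bootstrap resampling to produce a point estimate and an interval for a conceptual quantity. The predictors and response are simulated stand-ins for design variables (e.g. gross floor loading and building footprint) and measured concrete quantities; the study itself was implemented in R.

```python
# Support vector regression with bootstrap resampling; all data are simulated.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
n = 120
X = rng.uniform([2.0, 100.0], [10.0, 2000.0], size=(n, 2))   # loading (kN/m2), footprint (m2)
y = 0.05 * X[:, 0] * X[:, 1] + rng.normal(0, 20, n)          # hypothetical concrete quantity (m3)

preds = []
for _ in range(200):                       # bootstrap resamples
    idx = rng.integers(0, n, n)
    model = SVR(kernel="rbf", C=100.0, epsilon=5.0).fit(X[idx], y[idx])
    preds.append(model.predict([[6.0, 800.0]])[0])

print(np.mean(preds), np.percentile(preds, [2.5, 97.5]))      # point estimate and interval
```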

Keywords: bootstrapping, conceptual quantities, modelling, reinforced concrete, support vector regression

Procedia PDF Downloads 204
10184 Hampering The 'Right to Know': Consequences of the Excessive Interpretation of the Notion of Exemption from the Right to Information

Authors: Tomasz Lewinski

Abstract:

The right to know is becoming gradually recognised as an increasing number of states adopt national legislation on access to state-held information. These laws differ from each other in the scope of the right to information (hereinafter: RTI). In all RTI regimes there are exceptions to the general notion of the right, and states’ authorities too often use these exceptions to justify refusals of requests for state-held information. This paper sets out how states hamper RTI by relying on the notion of exception and by not providing an effective procedure that could redress unlawful denials. It is based on two selected examples of the incorporation of RTI into national legal regimes: the United Kingdom and South Africa. It succinctly outlines the international standard set out in Article 19 of the International Covenant on Civil and Political Rights (hereinafter: ICCPR) and its influence on RTI in the selected countries. As background to the further analysis, it briefly presents the Human Rights Committee’s jurisprudence and the standards articulated by successive Special Rapporteurs on freedom of opinion and expression. Subsequently, it offers a brief comparison of these standards with the regional standards, namely the African Charter on Human and Peoples' Rights and the European Convention on Human Rights. It critically discusses the regimes of exceptions in the RTI legislation of the respective national laws, showing how excessive these regimes are and what implications they have for transparency in general. A further objective is to classify the exceptions enumerated in the legislation of the selected states in relation to the exceptions provided in Article 19 of the ICCPR. Based on this division of exceptions by their nature, the paper compares both regimes of exceptions related to the principle of national security; that is, it compares the jurisprudence of domestic courts and reviews the practices that states’ authorities apply to RTI requests. The paper also evaluates the remedies available in the legislation, including the length and costs of the subsequent proceedings, which provides a general assessment of the given mechanisms and the potential risks of their ineffectiveness. The paper relies on an examination of the national legislation, commentaries by credible non-governmental organisations (e.g., The Public's Right to Know: Principles on Freedom of Information Legislation by Article 19, and The Tshwane Principles on National Security and the Right to Information), academic work, and research on the relevant judgments delivered by domestic and international courts. The conclusion assesses whether the selected countries’ legislation is in line with international law and trends, and whether the jurisprudence of the regional courts provides appropriate benchmarks for national courts to address RTI issues effectively. Furthermore, it identifies the largest shortcomings of the current legislation and the outcomes these lead to in the jurisprudence of domestic courts. Finally, it provides recommendations and policy arguments for states to improve transparency and to support local organisations in their endeavours to establish more transparent states and societies.

Keywords: access to information, freedom of information, national security, right to know, transparency

Procedia PDF Downloads 213
10183 Mapping Alternative Education in Italy: The Case of Popular and Second-Chance Schools and Interventions in Lombardy

Authors: Valeria Cotza

Abstract:

School drop-out is a multifactorial phenomenon that in Italy concerns all those underage students who, at different stages of school (up to 16 years old) or training (up to 18 years old), manifest educational difficulties ranging from dropping out of compulsory education without obtaining a qualification to grade repetition and absenteeism. From the 1980s to the 2000s, there was a progressive shift away from an economic and social model towards a multifactorial reading of the phenomenon, and the European Commission noted the importance of learning about the phenomenon through approaches able to integrate large-scale quantitative surveys with qualitative analyses. It is not only a matter of identifying the contextual factors affecting the phenomenon, but of problematising them by means of systemic and comprehensive in-depth analysis. A privileged point of observation and field of intervention are therefore those schools that propose alternative models of teaching and learning to the traditional ones, such as popular and second-chance schools. Alternative schools and interventions have grown in recent years in Europe as well as in the US and Latin America, working in the direction of greater equity in order to create the conditions (often absent in conventional schools) for everyone to achieve educational goals. Despite extensive Anglo-Saxon and US literature on this topic, there is as yet no unambiguous definition of alternative education, especially in Europe, where second-chance education has been studied the most. There is little literature on second-chance education in Italy and almost none on alternative education (with the exception of method schools, to which the concept of “alternative” is linked in Italy). This research aims to fill the gap by systematically surveying the alternative interventions in the area and beginning to explore some models of popular and second-chance schools and experiences through a mixed methods approach. The main research objectives concern the spread of alternative education in the Lombardy region, the main characteristics of these schools and interventions, and their effectiveness in terms of students’ well-being and school results. This paper addresses the first point by presenting the preliminary results of the first phase of the project, dedicated to mapping. Through the Google Forms platform, a questionnaire is being distributed to all schools in Lombardy and some schools in the rest of Italy to map the presence of alternative schools and interventions and their main characteristics. The distribution is also taking place with the support of the Milan Territorial and Lombardy Regional School Offices. Moreover, other social realities outside the school system (such as cooperatives and cultural associations) can be surveyed. The schools and other realities to be surveyed outside Lombardy will also be identified with the support of INDIRE (Istituto Nazionale per Documentazione, Innovazione e Ricerca Educativa, “National Institute for Documentation, Innovation and Educational Research”) and on the basis of the existing literature and the indicators of the “Futura” Plan of the PNRR (Piano Nazionale di Ripresa e Resilienza, “National Recovery and Resilience Plan”). Mapping will be crucial and functional for the subsequent qualitative and quantitative phase, which will make use of statistical analysis and constructivist grounded theory.

Keywords: school drop-out, alternative education, popular and second-chance schools, map

Procedia PDF Downloads 82
10182 Design Guidelines for URM Infills and Effect of Construction Sequence on Seismic Performance of Code Compliant RC Frame Buildings

Authors: Putul Haldar, Yogendra Singh, D. K. Paul

Abstract:

Un-reinforced masonry (URM) infilled RC frame buildings are the most common construction practice for modern multi-storey buildings in India, as in many other parts of the world. Although the behaviour and failure pattern of the global structure change significantly due to infill-frame interaction, the general design practice is to treat infills as non-structural elements, and their stiffness, strength and interaction with the frame are often ignored, as they are difficult to simulate. The Indian Standard, like many other major national codes, does not provide any explicit guideline for the modeling of infills. This paper takes stock of the controlling design provisions in some of the major national seismic design codes (BIS 2002; CEN 2004; NZS-4230 2004; ASCE-41 2007) intended to ensure the desired seismic performance of infilled frames. Most national codes on the seismic design of buildings still lack adequate guidelines on the modeling and design of URM infilled frames, which results in variable assumptions in analysis and design. Using nonlinear pushover analysis, this paper also presents the effect of one such assumption, the conventional ‘simultaneous’ analysis procedure for infilled frames, on the seismic performance of URM infilled RC frame buildings.

Keywords: URM infills, RC frame, seismic design codes, construction sequence of infilled frame

Procedia PDF Downloads 387
10181 Models of Environmental Crack Propagation of Some Aluminium Alloys (7xxx)

Authors: H. A. Jawan

Abstract:

This review describes the models of environment-related crack propagation of aluminum alloys (7xxx) developed during the last few decades. Knowledge of the effects of different factors on the susceptibility to SCC makes it possible to propose valuable mechanisms of crack advancement, and a reliable cracking mechanism in turn makes it possible to propose the optimum chemical composition and thermal treatment conditions, resulting in a microstructure most suitable for the real environmental conditions and stress state.

Keywords: microstructure, environmental, propagation, mechanism

Procedia PDF Downloads 417
10180 Thoughts on the Informatization Technology Innovation of Cores and Samples in China

Authors: Honggang Qu, Rongmei Liu, Bin Wang, Yong Xu, Zhenji Gao

Abstract:

There is a large gap between China and developed countries in the ability and level of informatization technology innovation for cores and samples. Against the current background of promoting technology innovation, how to strengthen the informatization technology innovation of cores and samples at the National Cores and Samples Archives, a national innovation research center, is an important research topic. This paper summarizes the development status of cores and samples informatization technology, identifies the gaps and deficiencies, and proposes innovation research directions and content, including data extraction, recognition, processing, integration and application, so as to provide reference and guidance for the future innovation research of the archives and to better support geological technology innovation in China.

Keywords: cores and samples, informatization technology, innovation, suggestion

Procedia PDF Downloads 124
10179 Application of the Micropolar Beam Theory for the Construction of the Discrete-Continual Model of Carbon Nanotubes

Authors: Samvel H. Sargsyan

Abstract:

Together with the study of the electron-optical properties of nanostructures, and proceeding from experiment-based data, the study of the mechanical properties of nanostructures has become highly topical. For the study of the mechanical properties of fullerene, carbon nanotubes, graphene and other nanostructures, one of the crucial issues is the construction of adequate mathematical models. Among all mathematical models of graphene or carbon nanotubes, the so-called discrete-continual model is especially important: it replaces the interactions between atoms with elastic beams or springs. The present paper demonstrates the construction of the discrete-continual beam model for carbon nanotubes and graphene, in which a micropolar beam model based on the theory of moment elasticity is adopted. Taking into account the energy balance principle, the elastic moment constants of the beam model, expressed by the physical and geometrical parameters of the carbon nanotube or graphene, are determined. By passing from the discrete-continual beam model to the continual one, models of a micropolar elastic cylindrical shell and a micropolar elastic plate are confirmed as continual models for the carbon nanotube and graphene, respectively.

Keywords: carbon nanotube, discrete-continual, elastic, graphene, micropolar, plate, shell

Procedia PDF Downloads 158
10178 Pricing European Options under Jump Diffusion Models with Fast L-stable Padé Scheme

Authors: Salah Alrabeei, Mohammad Yousuf

Abstract:

The goal of option pricing theory is to help investors manage their money, enhance returns and control their financial future by theoretically valuing their options. Modeling option pricing with Black-Scholes-type models with jumps makes it possible to account for market movements. However, such models can only be solved numerically, and not all numerical methods are efficient for these models because of their non-smooth payoffs or discontinuous derivatives at the exercise price. In this paper, the exponential time differencing (ETD) method is applied to solve the partial integro-differential equations arising in pricing European options under Merton’s and Kou’s jump-diffusion models. The Fast Fourier Transform (FFT) algorithm is used as a matrix-vector multiplication solver, which reduces the complexity from O(M²) to O(M log M). A partial fraction form of Padé schemes is used to overcome the complexity of inverting polynomials of matrices. These two tools ensure efficient and accurate numerical solutions. We construct a parallel and easy-to-implement version of the numerical scheme, and numerical experiments are given to show how fast and accurate the scheme is.
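To illustrate the O(M log M) trick mentioned above: the discretised jump integral is a convolution, so the corresponding Toeplitz matrix-vector product can be evaluated by embedding the Toeplitz matrix in a circulant one and applying the FFT. The kernel values below are hypothetical, and this is only a generic sketch, not the authors' scheme.

```python
# Toeplitz matrix-vector product via circulant embedding and FFT.
import numpy as np
from scipy.linalg import toeplitz

def toeplitz_matvec_fft(first_col, first_row, x):
    """Multiply a Toeplitz matrix (given by its first column/row) by x in O(M log M)."""
    m = len(x)
    # circulant embedding of size 2m
    c = np.concatenate([first_col, [0.0], first_row[1:][::-1]])
    y = np.fft.ifft(np.fft.fft(c) * np.fft.fft(np.concatenate([x, np.zeros(m)])))
    return y[:m].real

m = 8
col = np.exp(-np.arange(m, dtype=float))        # hypothetical jump-density kernel values
row = np.exp(-0.5 * np.arange(m, dtype=float))
x = np.random.default_rng(1).normal(size=m)

# Verify against the dense O(M^2) product.
assert np.allclose(toeplitz(col, row) @ x, toeplitz_matvec_fft(col, row, x))
print(toeplitz_matvec_fft(col, row, x))
```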

Keywords: integro-differential equations, L-stable methods, pricing European options, jump-diffusion model

Procedia PDF Downloads 149
10177 Modeling and Simulation Methods Using MATLAB/Simulink

Authors: Jamuna Konda, Umamaheswara Reddy Karumuri, Sriramya Muthugi, Varun Pishati, Ravi Shakya

Abstract:

This paper investigates the challenges involved in the mathematical modeling of plant simulation models, with the goal of ensuring that the plant models perform as closely as possible to the real-time physical model. The paper includes the analysis performed and an investigation of different methods of modeling, design and development for plant models. Issues that impact design time, model accuracy as a real-time model, and tool dependence are analyzed. The real-time hardware plant would be a combination of multiple physical models, which makes it more challenging to test the complete system with all possible test scenarios; there is also a possibility of failure or damage to the system due to unwanted test execution on the real-time system.

Keywords: model based design (MBD), MATLAB, Simulink, stateflow, plant model, real time model, real-time workshop (RTW), target language compiler (TLC)

Procedia PDF Downloads 341
10176 Application of Human Biomonitoring and Physiologically-Based Pharmacokinetic Modelling to Quantify Exposure to Selected Toxic Elements in Soil

Authors: Eric Dede, Marcus Tindall, John W. Cherrie, Steve Hankin, Christopher Collins

Abstract:

Current exposure models used in contaminated land risk assessment are highly conservative. Use of these models may lead to over-estimation of actual exposures, possibly resulting in negative financial implications due to unnecessary remediation. We are therefore carrying out a study seeking to improve our understanding of human exposure to selected toxic elements in soil: arsenic (As), cadmium (Cd), chromium (Cr), nickel (Ni), and lead (Pb) resulting from allotment land-use. The study employs biomonitoring and physiologically-based pharmacokinetic (PBPK) modelling to quantify human exposure to these elements. We recruited 37 allotment users (adults > 18 years old) in Scotland, UK, to participate in the study. Concentrations of the elements (and their bioaccessibility) were measured in allotment samples (soil and allotment produce). Records of the amount of produce consumed by the participants, together with the participants’ biological samples (urine and blood), were collected for up to 12 consecutive months. Ethical approval was granted by the University of Reading Research Ethics Committee. PBPK models (coded in MATLAB) were used to estimate the distribution and accumulation of the elements in key body compartments, thus indicating the internal body burden. Simulating low element intake (based on estimated ‘doses’ from produce consumption records), the predictive models suggested that detection of these elements in urine and blood was possible within a given period of time following exposure. This information was used in planning the biomonitoring and is currently being used in the interpretation of the test results from biological samples. Evaluation of the models is being carried out using the biomonitoring data, by comparing model-predicted concentrations with measured biomarker concentrations. The PBPK models will be used to generate bioavailability values, which could be incorporated into contaminated land exposure models. The findings from this study will thus promote a more sustainable approach to contaminated land management.
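The sketch below is an illustrative, deliberately minimal PBPK-style compartment model (not the study's calibrated MATLAB model): an ingested element is absorbed from the gut into blood and excreted into urine, linking a dietary intake to the urinary biomarker measured in biomonitoring. All rate constants are hypothetical.

```python
# Minimal gut -> blood -> urine compartment model; hypothetical parameters.
import numpy as np
from scipy.integrate import solve_ivp

ka, ke = 0.5, 0.1                     # absorption and urinary excretion rates (1/day)

def pbpk(t, y, intake):
    gut, blood, urine = y
    return [intake - ka * gut,        # gut lumen: dietary intake minus absorption
            ka * gut - ke * blood,    # blood compartment
            ke * blood]               # cumulative amount excreted in urine

sol = solve_ivp(pbpk, (0, 30), [0.0, 0.0, 0.0], args=(1.0,), dense_output=True)
print(sol.sol(np.linspace(0, 30, 4)))  # rows: gut, blood, cumulative urine
```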

Keywords: biomonitoring, exposure, PBPK modelling, toxic elements

Procedia PDF Downloads 319
10175 Comparisons of Co-Seismic Gravity Changes between GRACE Observations and the Predictions from the Finite-Fault Models for the 2012 Mw = 8.6 Indian Ocean Earthquake Off-Sumatra

Authors: Armin Rahimi

Abstract:

The Gravity Recovery and Climate Experiment (GRACE) has been a very successful project for determining mass redistribution within the Earth system. Large deformations caused by earthquakes lie in the high frequency band, but GRACE is only capable of providing reliable estimates of gravitational changes in the low-to-medium frequency band. In this study, we computed the gravity changes after the 2012 Mw 8.6 Indian Ocean earthquake off Sumatra using the GRACE Level-2 monthly spherical harmonic (SH) solutions released by the University of Texas Center for Space Research (UTCSR). Moreover, we calculated gravity changes using different fault models derived from teleseismic data. The model predictions showed non-negligible discrepancies in the gravity changes. However, after removing high-frequency signals using 350 km Gaussian filtering, commensurate with the GRACE spatial resolution, the discrepancies vanished: the spatial patterns of the total gravity changes predicted from all slip models became similar at the spatial resolution attainable by GRACE observations, and the predicted gravity changes were consistent with the GRACE-detected gravity changes. Nevertheless, fault models that give different slip amplitudes lead proportionally to different amplitudes in the predicted gravity changes.

Keywords: undersea earthquake, GRACE observation, gravity change, dislocation model, slip distribution

Procedia PDF Downloads 354
10174 Some Aspects of Improving Service Sphere Management in Georgia

Authors: Gechbaia Badri

Abstract:

The article studies issues of improving service sphere management in present-day Georgia. Transferring the country's economy to market relations and forming a competitive, dynamic market is dictated by the times and represents an objective necessity. In recent years, rapid changes in science and education have reshaped the service sphere, production skills and consumption patterns, and changed their role in the structure of the national economy. Intellectual capital has become the main resource in the new system of the economy, and economic progress is significantly determined by the development of information technologies. The article investigates the service problems of different fields of the national economy and offers proposals to address these problems.

Keywords: service management, service, paradigm, business and management engineering

Procedia PDF Downloads 415
10173 The Potential Threat of Cyberterrorism to the National Security: Theoretical Framework

Authors: Abdulrahman S. Alqahtani

Abstract:

The revolution in computing and networks could revolutionise terrorism in the same way that it has brought about changes in other aspects of life. The modern technological era has confronted countries with a new set of security challenges. There are many states and potential adversaries who have the potential and capacity in cyberspace to carry out cyber-attacks in the future. Some of them are currently conducting surveillance, gathering and analysis of technical information, and mapping of networks, nodes and the infrastructure of opponents, which may be exploited in future conflicts. This poster presents the results of a quantitative study (survey) to test the validity of the proposed theoretical framework for cyber terrorist threats. This theoretical framework will help to understand these new digital terrorist threats in depth. It may also be a practical guide for managers and technicians in critical infrastructure, to understand and assess the threats they face, and it might be the foundation for building a national strategy to counter cyberterrorism. The paper first provides basic information about the data. To purify the data, reliability analysis and exploratory factor analysis, as well as confirmatory factor analysis (CFA), were performed. Then, Structural Equation Modelling (SEM) was utilised to test the final model of the theory and to assess the overall goodness-of-fit between the proposed model and the collected data set.

Keywords: cyberterrorism, critical infrastructure, national security, theoretical framework, terrorism

Procedia PDF Downloads 403
10172 A Demonstration of How to Employ and Interpret Binary IRT Models Using the New IRT Procedure in SAS 9.4

Authors: Ryan A. Black, Stacey A. McCaffrey

Abstract:

Over the past few decades, great strides have been made towards improving the science in the measurement of psychological constructs. Item Response Theory (IRT) has been the foundation upon which statistical models have been derived to increase both precision and accuracy in psychological measurement. These models are now being used widely to develop and refine tests intended to measure an individual's level of academic achievement, aptitude, and intelligence. Recently, the field of clinical psychology has adopted IRT models to measure psychopathological phenomena such as depression, anxiety, and addiction. Because advances in IRT measurement models are being made so rapidly across various fields, it has become quite challenging for psychologists and other behavioral scientists to keep abreast of the most recent developments, much less learn how to employ and decide which models are the most appropriate to use in their line of work. In the same vein, IRT measurement models vary greatly in complexity in several interrelated ways including but not limited to the number of item-specific parameters estimated in a given model, the function which links the expected response and the predictor, response option formats, as well as dimensionality. As a result, inferior methods (a.k.a. Classical Test Theory methods) continue to be employed in efforts to measure psychological constructs, despite evidence showing that IRT methods yield more precise and accurate measurement. To increase the use of IRT methods, this study endeavors to provide a comprehensive overview of binary IRT models; that is, measurement models employed on test data consisting of binary response options (e.g., correct/incorrect, true/false, agree/disagree). Specifically, this study will cover the most basic binary IRT model, known as the 1-parameter logistic (1-PL) model dating back to over 50 years ago, up until the most recent complex, 4-parameter logistic (4-PL) model. Binary IRT models will be defined mathematically and the interpretation of each parameter will be provided. Next, all four binary IRT models will be employed on two sets of data: 1. Simulated data of N=500,000 subjects who responded to four dichotomous items and 2. A pilot analysis of real-world data collected from a sample of approximately 770 subjects who responded to four self-report dichotomous items pertaining to emotional consequences to alcohol use. Real-world data were based on responses collected on items administered to subjects as part of a scale-development study (NIDA Grant No. R44 DA023322). IRT analyses conducted on both the simulated data and analyses of real-world pilot will provide a clear demonstration of how to construct, evaluate, and compare binary IRT measurement models. All analyses will be performed using the new IRT procedure in SAS 9.4. SAS code to generate simulated data and analyses will be available upon request to allow for replication of results.
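For readers unfamiliar with the parameterisations discussed above, the following sketch evaluates the binary item response functions from the 1-PL up to the 4-PL model, where a is the discrimination, b the difficulty, c the lower (guessing) asymptote and d the upper asymptote. The parameter values are hypothetical, and the snippet only illustrates the functional form; the paper itself fits these models with the IRT procedure in SAS 9.4.

```python
# Binary IRT item response functions (1-PL through 4-PL); hypothetical parameters.
import numpy as np

def irt_prob(theta, a=1.0, b=0.0, c=0.0, d=1.0):
    """4-PL probability of endorsing/answering an item correctly.
    a: discrimination, b: difficulty, c: lower (guessing) asymptote, d: upper asymptote.
    Setting c=0 and d=1 gives the 2-PL; additionally fixing a gives the 1-PL."""
    return c + (d - c) / (1.0 + np.exp(-a * (theta - b)))

theta = np.linspace(-3, 3, 7)                         # latent trait values
print(irt_prob(theta, b=0.5))                         # 1-PL-style item (a fixed at 1)
print(irt_prob(theta, a=1.8, b=0.5))                  # 2-PL
print(irt_prob(theta, a=1.8, b=0.5, c=0.2))           # 3-PL
print(irt_prob(theta, a=1.8, b=0.5, c=0.2, d=0.95))   # 4-PL
```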

Keywords: instrument development, item response theory, latent trait theory, psychometrics

Procedia PDF Downloads 356
10171 Mathematical Modeling of the Working Principle of Gravity Gradient Instrument

Authors: Danni Cong, Meiping Wu, Hua Mu, Xiaofeng He, Junxiang Lian, Juliang Cao, Shaokun Cai, Hao Qin

Abstract:

The gravity field is of great significance in geoscience, the national economy and national security, and gravitational gradient measurement has been extensively studied due to its higher accuracy than gravity measurement. The gravity gradient sensor, one of the core devices of the gravity gradient instrument, plays a key role in measurement accuracy. This paper therefore starts by analyzing the working principle of the gravity gradient sensor using Newton’s law, and then considers the relative motion between inertial and non-inertial systems to build an adequate mathematical model, laying a foundation for measurement error calibration and measurement accuracy improvement.
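For context on the measurand, the sketch below evaluates the gravity gradient tensor (second derivatives of the gravitational potential) of a point mass, Gamma_ij = GM (3 x_i x_j - r^2 delta_ij) / r^5. This is only a textbook illustration; the paper's sensor model (rotating accelerometers, inertial versus non-inertial frames) is not reproduced here.

```python
# Gravity gradient tensor of a point mass at the origin, evaluated at `pos`.
import numpy as np

G = 6.674e-11                      # gravitational constant (m^3 kg^-1 s^-2)

def gravity_gradient(mass, pos):
    """Gravity gradient tensor in 1/s^2 (1 Eotvos = 1e-9 s^-2); trace is zero."""
    r = np.linalg.norm(pos)
    return G * mass * (3.0 * np.outer(pos, pos) - r**2 * np.eye(3)) / r**5

# Earth treated as a point mass, evaluated at one Earth radius:
print(gravity_gradient(5.972e24, np.array([6.371e6, 0.0, 0.0])))
```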

Keywords: gravity gradient, gravity gradient sensor, accelerometer, single-axis rotation modulation

Procedia PDF Downloads 325
10170 D-Wave Quantum Computing Ising Model: A Case Study for Forecasting of Heat Waves

Authors: Dmytro Zubov, Francesco Volponi

Abstract:

In this paper, a D-Wave quantum computing Ising model is used for the forecasting of positive extremes of daily mean air temperature. Forecast models are designed with two to five qubits, which represent 2-, 3-, 4-, and 5-day historical data respectively. The Ising model’s real-valued weights and dimensionless coefficients are calculated using daily mean air temperatures from 119 places around the world, as well as a sea-level site (Aburatsu, Japan). In comparison with current methods, this approach is better suited to predicting heat wave values because it does not require the estimation of a probability distribution from scarce observations. The proposed quantum computing forecast algorithm is simulated on a traditional computer architecture, with combinatorial optimization of the Ising model parameters, for the Ronald Reagan Washington National Airport dataset with a 1-day lead time on the learning sample (1975-2010). Analysis of the forecast accuracy (ratio of successful predictions to total number of predictions) on the validation sample (2011-2014) shows that the Ising model with three qubits has 100% accuracy, which is quite significant compared to other methods; however, the number of identified heat waves is small (only one out of nineteen in this case). The other models, with 2, 4, and 5 qubits, have 20%, 3.8%, and 3.8% accuracy respectively. The presented three-qubit forecast model is applied to the prediction of heat waves at five other locations: Aurel Vlaicu, Romania (accuracy 28.6%); Bratislava, Slovakia (21.7%); Brussels, Belgium (33.3%); Sofia, Bulgaria (50%); and Akhisar, Turkey (21.4%). These predictions are not ideal, but not zero either; they can be used independently or together with predictions generated by other methods. The loss of human life, as well as environmental, economic, and material damage, from extreme air temperatures could be reduced if some heat waves are predicted. Even a small success rate implies a large socio-economic benefit.
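As a classical stand-in for the formulation described above, the sketch below brute-forces the minimum-energy configuration of a three-spin Ising model, where the spins play the role of the three qubits encoding three days of history. The biases h and couplings J are hypothetical, not the paper's fitted values; on D-Wave hardware the minimisation would be performed by quantum annealing rather than enumeration.

```python
# Brute-force ground state of a tiny Ising model E(s) = sum_i h_i s_i + sum_{i<j} J_ij s_i s_j.
import itertools
import numpy as np

h = np.array([0.3, -0.2, 0.5])                 # hypothetical per-qubit biases
J = {(0, 1): -0.4, (1, 2): 0.6, (0, 2): 0.1}   # hypothetical pairwise couplings

def ising_energy(s):
    return float(h @ s + sum(J[i, j] * s[i] * s[j] for i, j in J))

best = min(itertools.product([-1, 1], repeat=3), key=ising_energy)
print(best, ising_energy(np.array(best)))      # minimum-energy spin configuration
```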

Keywords: heat wave, D-wave, forecast, Ising model, quantum computing

Procedia PDF Downloads 495
10169 Automatic and High Precise Modeling for System Optimization

Authors: Stephanie Chen, Mitja Echim, Christof Büskens

Abstract:

To describe and propagate the behavior of a system, mathematical models are formulated, and parameter identification is used to adapt the coefficients of the underlying laws of science. For complex systems this approach can be incomplete, and hence imprecise, and moreover too slow to be computed efficiently. These models might therefore not be applicable for the numerical optimization of real systems, since such techniques require numerous evaluations of the models. Moreover, not all quantities necessary for the identification might be available, and hence the system must be adapted manually. Therefore, an approach is described that generates models which overcome the aforementioned limitations by focusing not on physical laws but on measured (sensor) data of real systems. The approach is more general since it generates models for every system, detached from the scientific background. Additionally, it can be used in a more general sense, since it is able to automatically identify correlations in the data. The method can be classified as a multivariate data regression analysis. In contrast to many other data regression methods, this variant is also able to identify correlations of products of variables and not only of single variables, which enables a far more precise and better representation of causal correlations. The basis and explanation of this method come from an analytical background: the series expansion. Another advantage of this technique is the possibility of real-time adaptation of the generated models during operation. Herewith, system changes due to aging, wear or perturbations from the environment can be taken into account, which is indispensable for realistic scenarios. Since these data-driven models can be evaluated very efficiently and with high precision, they can be used in mathematical optimization algorithms that minimize a cost function, e.g. time, energy consumption, operational costs or a mixture of them, subject to additional constraints. The proposed method has been tested successfully in several complex applications with strong industrial requirements. The generated models were able to simulate the given systems with an error in precision of less than one percent, and the automatic identification of correlations was able to discover previously unknown relationships. In summary, the approach is able to efficiently compute highly precise and real-time-adaptive data-based models in different fields of industry. Combined with an effective mathematical optimization algorithm like WORHP (We Optimize Really Huge Problems), several complex systems can now be represented by a high-precision model and optimized according to the user’s wishes. The proposed methods will be illustrated with different examples.
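A minimal sketch of the series-expansion idea described above: a data-driven model that regresses the output not only on single sensor variables but also on their products (cross terms), which is how correlations between products of variables can be exposed. The data and the hidden relationship are simulated; the authors' actual implementation (and its coupling to WORHP) is not shown.

```python
# Least-squares fit with single variables and a product (cross) term; simulated data.
import numpy as np

rng = np.random.default_rng(42)
n = 500
x1, x2 = rng.normal(size=n), rng.normal(size=n)
y = 2.0 * x1 - 0.5 * x2 + 1.5 * x1 * x2 + rng.normal(0, 0.05, n)  # hidden product term

# Design matrix: constant, single variables, and the product term.
A = np.column_stack([np.ones(n), x1, x2, x1 * x2])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print(coef)   # recovers approximately [0, 2.0, -0.5, 1.5], exposing the x1*x2 correlation
```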

Keywords: adaptive modeling, automatic identification of correlations, data based modeling, optimization

Procedia PDF Downloads 408
10168 Slavery Transcending Borders: An Analysis of Human Trafficking in Europe and the EU’s Impact on the Issue

Authors: Santiago Martínez Hernández

Abstract:

The establishment of the European Union signified the culmination of supra-national power addressing economic, political, legal and humanitarian matters within and above a national territory. Human rights have taken a protagonist role as one of the pressing concerns that the EU addresses, and one of the most critical problems is that of human trafficking. This multi-billion dollar criminal business generates an estimated $31.6 billion per year from 2.5 million trafficked persons worldwide, making it one of the most crucial human rights problems in the world to address. The EU has developed strategies to tackle this issue through supra-national governance; however, how have they fared, and what has been the impact of the EU's development on the issue? This paper addresses the direct and indirect impact of the formation of the European Union as a supranational political and economic entity on the illicit industry of human trafficking in Europe. It first attempts to analyse the situation of human trafficking in Europe, in order to understand its importance in the region, addressing its root causes and the role of the states concerned. Second, the paper examines the impact of the EU on human trafficking, breaking down its policy-making at a supranational level, the role of the economic integration of the region, and the change in migration patterns since its inception.

Keywords: human trafficking, human rights, European Union, criminal business

Procedia PDF Downloads 358
10167 An Evaluation of Education Provision for Students with Autism Spectrum Disorder in Ireland: The Role of the Special Needs Assistant

Authors: Claire P. Griffin

Abstract:

The education provision for students with special educational needs, including students with Autism Spectrum Disorder (ASD), has undergone significant national and international changes in recent years. In particular, an increase in resource-based provision has occurred across educational settings in an effort to support inclusive practices. This paper seeks to explore the role of the Special Needs Assistant (SNA) in supporting children with ASD in Irish schools. This research stems from the second national evaluation of ‘Education Provision for Students with Autism Spectrum Disorder in Ireland’ (NCSE, 2016). This research was commissioned by the National Council for Special Education (NCSE) in Ireland and conducted by a team of researchers from Mary Immaculate College, Limerick from February to July 2014. This study involved a multiple case study research strategy across 24 educational sites, as selected through a stratified sampling process. Research strategies included semi-structured interviews, classroom observations, documentary review and child conversations. Data analysis was conducted electronically using Nvivo software, with use of an additional quantitative recording mechanism based on scaled weighting criteria for collected data. Based on such information, key findings from the NCSE national evaluation will be presented and critically reviewed, with particular reference to the role of the SNA in supporting pupils with ASD. Examples of positive practice inherent within the SNA role will be outlined and contrasted with discrete areas for development. Based on such findings, recommendations for the evolving role of the SNA will be presented, with the aim of informing both policy and best practice within the field.

Keywords: autism spectrum disorder, inclusive education, paraprofessional, special needs assistant

Procedia PDF Downloads 277
10166 Behavior of Steel Moment Frames Subjected to Impact Load

Authors: Hyungoo Kang, Minsung Kim, Jinkoo Kim

Abstract:

This study investigates the performance of 2D and 3D steel moment frames subjected to vehicle collision at a first-story column using LS-DYNA. The finite element models of vehicles provided by the National Crash Analysis Center (NCAC) are used for the numerical analysis. Nonlinear dynamic time history analyses of the 2D and 3D model structures are carried out based on the arbitrary column removal scenario, and the vertical displacement of the damaged structures is compared with that obtained from the collision analysis. The analysis results show that the model structure remains stable when the speed of the vehicle is 40 km/h. However, at speeds of 80 and 120 km/h, both the 2D and 3D structures fail by progressive collapse. The vertical displacement of the damaged joint obtained from the collision analysis is significantly larger than the displacement computed based on the arbitrary column removal scenario.

Keywords: vehicle collision, progressive collapse, FEM, LS-DYNA

Procedia PDF Downloads 340
10165 Adaptation of Requirement Engineering Practices in Pakistan

Authors: Waqas Ali, Nadeem Majeed

Abstract:

Requirement engineering is an essential part of the software development life cycle: the more time spent on requirement engineering, the higher the probability of success, and effective requirement engineering ensures a successful software product. This paper presents the adaptation of requirement engineering practices in small and medium-sized companies in Pakistan. The study was conducted using questionnaires to show to what extent requirement engineering models and practices are followed in Pakistan.

Keywords: requirement engineering, Pakistan, models, practices, organizations

Procedia PDF Downloads 717
10164 Using Photogrammetry to Survey the Côa Valley Iron Age Rock Art Motifs: Vermelhosa Panel 3 Case Study

Authors: Natália Botica, Luís Luís, Paulo Bernardes

Abstract:

The Côa Valley, listed World Heritage since 1998, presents more than 1300 open-air engraved rock panels. The Archaeological Park of the Côa Valley recorded the rock art motifs, testing various techniques based on direct tracing processes on the rock, using natural and artificial lighting. In this work, integrated in the "Open Access Rock Art Repository" (RARAA) project, we present the methodology adopted for the vectorial drawing of the rock art motifs based on orthophotos taken from the photogrammetric survey and 3D models of the rocks. We also present the information system designed to integrate the vector drawing and the characterization data of the motifs, as well as the open access sharing, in order to promote their reuse in multiple areas. The 3D models themselves constitute a very detailed record, ensuring the digital preservation of the rock and iconography. Thus, even if a rock or motif disappears, it can continue to be studied and even recreated.

Keywords: rock art, archaeology, iron age, 3D models

Procedia PDF Downloads 81
10163 Local People’s Livelihoods and Coping Strategies in the Wake of a Co-management System in the Campo Ma'an National Park, Cameroon

Authors: Nchanji Yvonne Kiki, Mala William Armand, Nchanji Eileen Bogweh, Ramcilovik-Suominen Sabaheta, Kotilainen Juha

Abstract:

The Campo Ma'an National Park was created as part of an environmental and biodiversity compensation for the Chad-Cameroon Oil Pipeline Project, which was meant to help alleviate poverty and boost the livelihoods of rural communities around the area. This paper examines the different strategies and coping mechanisms employed by indigenous people and local communities to deal with the nationally and internationally driven conservation policies and initiatives in the case of the Campo Ma'an National Park. While most literature on park management/co-management/nature conservation has focused on the negative implications for local peoples’ livelihoods, fewer studies have investigated the strategies of local people to respond to these policies and renegotiate their position in a way that enables them to continue their traditional livelihoods using existing local knowledge systems. This study contributes to the current literature by zooming in not only on the impacts of nature conservation policies but also on the local individual and collective strategies and responses to such policies and initiatives. We employ a qualitative research approach using ethnomethodology and a convivial lens to analyze data collected from October to November 2018. We find that conservation policies have worsened some existing livelihoods on the one hand and constrained livelihood improvement of indigenous people and local communities (IPLC) on the other hand. Nonetheless, the IPLC have devised individual and collective coping mechanisms to deal with these conservation interventions and the negative effects they have caused. Upon exploring these mechanisms and their effectiveness, this study proposes a management approach to conservation centered on both people and nature, based on indigenous and local people's knowledge and practices, promoting nature for and by humans and strengthening both livelihood and conservation. We take inspiration from the convivial conservation approach and thinking of Büscher and Fletcher.

Keywords: conservation policies, national park management, indigenous and local people’s experiences, livelihoods, local knowledge, coping strategies, conviviality

Procedia PDF Downloads 180
10162 Models of Environmental Crack Propagation of Some Aluminum Alloys (7xxx)

Authors: H. Jawan

Abstract:

This review describes the models of environment-related crack propagation of aluminum alloys (7xxx) developed during the last few decades. Knowledge of the effects of different factors on the susceptibility to SCC makes it possible to propose valuable mechanisms of crack advancement, and a reliable cracking mechanism in turn makes it possible to propose the optimum chemical composition and thermal treatment conditions, resulting in a microstructure most suitable for the real environmental conditions and stress state.

Keywords: microstructure, environmental, propagation, mechanism

Procedia PDF Downloads 388
10161 Assessing Performance of Data Augmentation Techniques for a Convolutional Network Trained for Recognizing Humans in Drone Images

Authors: Masood Varshosaz, Kamyar Hasanpour

Abstract:

In recent years, there has been growing interest in recognizing humans in drone images for post-disaster search and rescue operations. Deep learning algorithms have shown great promise in this area, but they often require large amounts of labeled data to train the models. To keep the data acquisition cost low, augmentation techniques can be used to create additional data from existing images. Many such techniques can help generate variations of an original image to improve the performance of deep learning algorithms. While data augmentation is generally assumed to improve the accuracy and robustness of the models, it is important to ensure that the performance gains are not outweighed by the additional computational cost or complexity of implementing the techniques. To this end, it is important to evaluate the impact of data augmentation on the performance of the deep learning models. In this paper, we evaluated the most common currently available 2D data augmentation techniques on a standard convolutional network trained for recognizing humans in drone images. The techniques include rotation, scaling, random cropping, flipping, shifting, and their combinations. The results showed that the augmented models perform 1-3% better than the base network. However, as the augmented images only contain the human parts already visible in the original images, a new data augmentation approach is needed to include the invisible parts of the human body. We therefore suggest a new method that employs simulated 3D human models to generate new data for training the network.
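The sketch below only illustrates the kinds of 2D augmentations evaluated above (rotation, scaling, random cropping, flipping, shifting) applied to a dummy image array; the paper's actual training pipeline, network and parameter ranges are not reproduced, and all ranges here are placeholder choices.

```python
# Simple 2D augmentation pipeline on a dummy image; all ranges are hypothetical.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(0)
image = rng.random((128, 128, 3))                     # stand-in for a drone image

def augment(img, rng):
    out = ndimage.rotate(img, angle=rng.uniform(-15, 15), axes=(0, 1), reshape=False)  # rotation
    if rng.random() < 0.5:
        out = out[:, ::-1, :]                          # horizontal flip
    out = ndimage.shift(out, shift=(rng.integers(-8, 9), rng.integers(-8, 9), 0))      # shift
    zoom = rng.uniform(0.9, 1.1)
    out = ndimage.zoom(out, (zoom, zoom, 1))           # scaling
    top = rng.integers(0, max(out.shape[0] - 112, 1))  # random crop to a fixed size
    left = rng.integers(0, max(out.shape[1] - 112, 1))
    return out[top:top + 112, left:left + 112, :]

print(augment(image, rng).shape)
```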

Keywords: human recognition, deep learning, drones, disaster mitigation

Procedia PDF Downloads 91
10160 Review of Concepts and Tools Applied to Assess Risks Associated with Food Imports

Authors: A. Falenski, A. Kaesbohrer, M. Filter

Abstract:

Introduction: Risk assessments can be performed in various ways and with different degrees of complexity. In order to assess risks associated with imported foods, additional information needs to be taken into account compared to a risk assessment of regional products. The present review is an overview of currently available best-practice approaches and data sources used for food import risk assessments (IRAs). Methods: A literature review was performed. PubMed was searched for articles about food IRAs published in the years 2004 to 2014 (English and German texts only, search string “(English [la] OR German [la]) (2004:2014 [dp]) import [ti] risk”). Titles and abstracts were screened for import risks in the context of IRAs. The finally selected publications were analysed according to a predefined questionnaire extracting the following information: risk assessment guidelines followed, modelling methods used, data and software applied, and the existence of an analysis of uncertainty and variability. IRAs cited in these publications were also included in the analysis. Results: The PubMed search resulted in 49 publications, 17 of which contained information about import risks and risk assessments. Within these, 19 cross-references were identified as being of interest for the present study. These included original articles, reviews and guidelines. At least one of the guidelines of the World Organisation for Animal Health (OIE) or the Codex Alimentarius Commission was referenced in each of the IRAs, for the import of animals or for imports concerning foods, respectively. Interestingly, a combination of both was also used to assess the risk associated with the import of live animals serving as a source of food. Methods ranged from fully quantitative IRAs using probabilistic models and dose-response models to qualitative IRAs in which decision trees or severity tables were set up using parameter estimates based on expert opinions. Calculations were done using @Risk, R or Excel. Most heterogeneous was the type of data used, ranging from general information on imported goods (food, live animals) to pathogen prevalence in the country of origin. These data were either publicly available in databases or lists (e.g., OIE WAHID and Handystatus II, FAOSTAT, Eurostat, TRACES), accessible at a national level (e.g., herd information) or only open to a small group of people (flight passenger import data at a national airport customs office). In the IRAs, an uncertainty analysis was mentioned in some cases, but the corresponding calculations were performed only in a few cases. Conclusion: The current state of the art in the assessment of risks of imported foods is characterized by great heterogeneity in the general methodology and data used. Often, information is gathered on a case-by-case basis and reformatted by hand in order to perform the IRA. This analysis therefore illustrates the need for a flexible, modular framework supporting the connection of existing data sources with data analysis and modelling tools. Such an infrastructure could pave the way to IRA workflows applicable ad hoc, e.g. in the case of a crisis situation.

Keywords: import risk assessment, review, tools, food import

Procedia PDF Downloads 301