Search results for: model based clustering
36946 Investment Adjustments to Exchange Rate Fluctuations: Evidence from Manufacturing Firms in Tunisia
Authors: Mourad Zmami, Oussema BenSalha
Abstract:
The current research aims to assess empirically the reaction of private investment to exchange rate fluctuations in Tunisia, using a sample of 548 firms operating in manufacturing industries between 1997 and 2002. The micro-econometric model we estimate is based on an accelerator-profit investment specification augmented by two variables that measure the variation and the volatility of exchange rates. Estimates obtained with the system GMM method reveal that exchange rate depreciation affects investment negatively, since it increases the cost of imported capital goods. Turning to exchange rate volatility, as measured by a GARCH(1,1) model, our findings assign a significant role to exchange rate uncertainty in explaining the sluggishness of private investment in Tunisia in the full sample of firms. Further estimations based on various sub-samples indicate that the elasticities of investment with respect to exchange rate volatility depend on firm-specific characteristics such as size and ownership structure.
Keywords: investment, exchange rate volatility, manufacturing firms, system GMM, Tunisia
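The volatility measure named above comes from a GARCH(1,1) model. As a minimal sketch of how such a conditional volatility series is produced from exchange-rate returns, the following numpy code implements the standard GARCH(1,1) variance recursion with illustrative parameter values (not the paper's estimates) and simulated returns.

```python
import numpy as np

def garch11_variance(returns, omega=0.01, alpha=0.08, beta=0.90):
    """Conditional variance recursion of a GARCH(1,1) model:
    sigma2_t = omega + alpha * r_{t-1}^2 + beta * sigma2_{t-1}.
    Parameter values are illustrative, not estimates from the paper."""
    sigma2 = np.empty_like(returns)
    sigma2[0] = returns.var()              # initialise with the sample variance
    for t in range(1, len(returns)):
        sigma2[t] = omega + alpha * returns[t - 1] ** 2 + beta * sigma2[t - 1]
    return sigma2

# toy exchange-rate returns (simulated, for illustration only)
rng = np.random.default_rng(0)
r = rng.normal(0.0, 0.02, size=72)
vol = np.sqrt(garch11_variance(r))         # conditional volatility series
print(vol[:5])
```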
Procedia PDF Downloads 410
36945 A Communication Signal Recognition Algorithm Based on Holder Coefficient Characteristics
Authors: Hui Zhang, Ye Tian, Fang Ye, Ziming Guo
Abstract:
Communication signal modulation recognition technology is one of the key technologies in the field of modern information warfare. At present, automatic modulation recognition methods fall into two major categories: maximum likelihood hypothesis testing based on decision theory, and statistical pattern recognition based on feature extraction. The statistical pattern recognition approach, which comprises feature extraction and classifier design, is currently the most widely used. With the increasingly complex electromagnetic environment of communications, how to effectively extract the features of various signals at low signal-to-noise ratio (SNR) is a hot topic for scholars in various countries. To address this problem, this paper proposes a feature extraction algorithm for communication signals based on an improved Holder cloud feature, and an extreme learning machine (ELM), chosen to meet the real-time requirements of modern warfare, is used to classify the extracted features. The algorithm extracts the digital features of the improved cloud model without deterministic information in a low-SNR environment and uses the improved cloud model to obtain more stable Holder cloud features, thereby improving performance. The approach overcomes the difficulty that a simple feature extraction algorithm based on the Holder coefficient has in recognizing signals at low SNR, and it also achieves better recognition accuracy. Simulation results show that the approach still classifies well at low SNR; even at an SNR of -15 dB, the recognition accuracy reaches 76%.
Keywords: communication signal, feature extraction, Holder coefficient, improved cloud model
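The classifier named in the abstract is an extreme learning machine. The sketch below is a generic single-hidden-layer ELM in numpy (random hidden weights, output weights solved by least squares), not the authors' implementation; the feature vectors and modulation labels are placeholders.

```python
import numpy as np

class SimpleELM:
    """Minimal extreme learning machine: random hidden layer, least-squares output weights."""
    def __init__(self, n_hidden=100, seed=0):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(seed)

    def fit(self, X, y):
        n_classes = int(y.max()) + 1
        T = np.eye(n_classes)[y]                       # one-hot targets
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = np.tanh(X @ self.W + self.b)               # hidden-layer activations
        self.beta = np.linalg.pinv(H) @ T              # output weights via pseudo-inverse
        return self

    def predict(self, X):
        H = np.tanh(X @ self.W + self.b)
        return (H @ self.beta).argmax(axis=1)

# placeholder data: rows stand in for Holder-coefficient feature vectors, labels for modulation classes
X = np.random.default_rng(1).normal(size=(300, 4))
y = np.random.default_rng(2).integers(0, 3, size=300)
print(SimpleELM(n_hidden=50).fit(X, y).predict(X[:5]))
```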
Procedia PDF Downloads 155
36944 A Robust Optimization Model for Multi-Objective Closed-Loop Supply Chain
Authors: Mohammad Y. Badiee, Saeed Golestani, Mir Saman Pishvaee
Abstract:
In recent years, consumers and governments have increasingly pushed companies to design their activities so as to reduce negative environmental impacts, whether by producing renewable products or by adopting safe disposal policies. It is therefore important to optimize the various aspects of the total supply chain more accurately. Modeling a supply chain can be a challenging process because a large number of factors need to be considered in the model. The use of multi-objective optimization can help overcome these problems, since more information is used when designing the model. Uncertainty is inevitable in the real world. Considering uncertainty in parameters, in addition to using multiple objectives, gives more flexibility to the decision-making process, since the process can take into account many more constraints and requirements. In this paper, we present a stochastic scenario-based robust model to cope with uncertainty in a closed-loop multi-objective supply chain. By applying the proposed model to a real-world case, its power in handling data uncertainty is demonstrated.
Keywords: supply chain management, closed-loop supply chain, multi-objective optimization, goal programming, uncertainty, robust optimization
Procedia PDF Downloads 415
36943 A Case Study on Smart Energy City of the UK: Based on Business Model Innovation
Authors: Minzheong Song
Abstract:
The purpose of this paper is to examine the evolution of smart energy in the UK, tracing government projects and smart city initiatives such as the 'Smart London Plan (SLP)' of 2013 through the logic of business model innovation (BMI). To this end, it discusses the theoretical logic and formulates a research framework for the evolution of smart energy from a silo to an integrated system. The starting point is the silo system with no connections; the second stage comprises private investment in smart meters, smart grid implementation, the energy-water nexus, adaptive smart grid systems, and the building of marketplaces with platform leadership. The analysis shows that the UK's smart energy sector has evolved from smart meter installation through smart grids to new business models such as the water-energy nexus and microgrid services within the smart energy city system.
Keywords: smart city, smart energy, business model, business model innovation (BMI)
Procedia PDF Downloads 161
36942 Destination Decision Model for Cruising Taxis Based on Embedding Model
Authors: Kazuki Kamada, Haruka Yamashita
Abstract:
In Japan, taxis are a popular means of transportation, and the taxi industry is a major business. In recent years, however, the industry has faced the difficult problem of a shrinking number of taxi drivers. Taxi drivers catch passengers mainly in three ways. The first is 'cruising', in which drivers pick up passengers while driving along roads. The second is 'waiting', in which drivers wait near places with high demand for taxis, such as hospital entrances and train stations. The third is 'dispatching', in which rides are allocated through requests to the taxi company. Cruising in particular requires experience and intuition to find passengers, and deciding the destination for cruising is difficult. A strong recommendation system for cruising taxis would support new drivers in finding passengers and could help counter the declining number of drivers in the industry. In this research, we propose a method for recommending destinations to cruising taxi drivers. As a machine learning technique, embedding models, which map high-dimensional data to a low-dimensional space, are widely used in data analysis to represent relationships of meaning between data points clearly. Taxi drivers have favorite routes based on their experience, and these routes differ from driver to driver. We assume that a cruising route carries meaning, such as a route for finding business passengers (circling the business district or heading to main stations) or a route for finding tourists (circling sightseeing spots or large hotels), and we extract the meaning of their destinations. We analyze the cruising history data of taxis with an embedding model and propose a recommendation system for finding passengers. Finally, we demonstrate destination recommendations for cruising taxi drivers through an analysis of real-world data using the proposed method.
Keywords: taxi industry, decision making, recommendation system, embedding model
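One way to realise the embedding step described above is to treat each driver's sequence of visited zones as a "sentence" and learn zone vectors with a word2vec-style model. The sketch below assumes the gensim library is available and uses invented zone identifiers; it illustrates the idea only and is not the authors' system.

```python
from gensim.models import Word2Vec  # assumed available; gensim >= 4.0 uses `vector_size`

# hypothetical cruising histories: each list is one shift's sequence of visited zones
histories = [
    ["station_north", "business_a", "business_b", "station_north"],
    ["hotel_bay", "sightseeing_1", "sightseeing_2", "hotel_bay"],
    ["business_a", "business_c", "station_north", "business_a"],
]

# skip-gram embedding of zones: zones visited in similar cruising contexts get similar vectors
model = Word2Vec(histories, vector_size=16, window=3, min_count=1, sg=1, epochs=200, seed=0)

# nearest zones in embedding space become candidate cruising destinations
print(model.wv.most_similar("business_a", topn=2))
```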
Procedia PDF Downloads 138
36941 Fetal Movement Study Using Biomimics of the Maternal March
Authors: V. Diaz, B. Pardo , D. Villegas
Abstract:
Most babies born prematurely have complications at birth. These complications can be reduced if the infant is provided with a relaxing environment that resembles intrauterine life; to this end, there are programs in which mothers lull and rock their babies. However, the conditions under which they do so, and the way they do it, may not be the appropriate ones. Here we describe an investigation based on the biomimics of the kinematics of human fetal movement, which consists of determining the movements that the fetus experiences, and the deformations of the components surrounding it, during a gentle walk at week 32 of gestation. The research is based on a 3D model comprising the anatomical structure of the pelvis, the fetus, the muscles, the uterus and its most important supporting elements (ligaments). Normal loading conditions corresponding to this stage of gestation and the kinematics of a gentle walk of a pregnant mother are applied to this model, focused on the pelvic bone, which allows the response of the other elements of the model to be obtained. Solidworks software was used for the modeling and subsequent simulation. From this analysis, the curves describing the movement of the fetus at three different points were obtained. Additionally, we found the deformation of the uterus and of the ligaments that support it, showing the characteristics these tissues can exhibit while supporting the fetus. These data can be used in the construction of devices that aid the normal development of premature infants.
Keywords: simulation, biomimic, uterine model, fetal movement study
Procedia PDF Downloads 165
36940 Modeling Influence on Petty Corruption Attitudes
Authors: Nina Bijedic, Drazena Gaspar, Mirsad Hadzikadic
Abstract:
Corruption is an influential and widespread problem. One part of it is so-called petty corruption, the large-scale giving of bribes by ordinary citizens trying to influence the work of public administration or public services. As with all forms of corruption, petty corruption is related to the level of democracy (or administrative efficiency) in a society. The developed model captures some of the factors related to corruptive behavior, as well as people's attitudes towards petty corruption. It has four basic elements: a user's perception of corruption in the society of interest, the influence of social interactions, the influence of a penalizing mechanism, and the influence of campaigns against petty corruption. The model is agent-based and developed in NetLogo, with many randomized settings that provide a wider scope of responses. Interactions of different settings of the variables of these elements provide insight into the influence of each element on attitudes towards petty corruption, as well as on petty corruptive behavior.
Keywords: agent-based model, attitude, influence, petty corruption, society
Procedia PDF Downloads 199
36939 Parameter Estimation of Additive Genetic and Unique Environment (AE) Model on Diabetes Mellitus Type 2 Using Bayesian Method
Authors: Andi Darmawan, Dewi Retno Sari Saputro, Purnami Widyaningsih
Abstract:
Diabetes mellitus (DM) is a chronic disease in humans that occurs when the pancreas cannot produce enough insulin or the body uses insulin ineffectively, causing an increased level of glucose in the blood, a condition called hyperglycemia. In Indonesia, DM is a serious health problem because it can cause blindness, kidney disease, diabetic foot (gangrene), and stroke. DM is also classified by its main causes into type 1, type 2, and gestational diabetes. Type 1 diabetes, previously known as insulin-dependent diabetes, is due to a lack of insulin production. Type 2 diabetes, previously known as non-insulin-dependent diabetes, is due to the ineffective use of insulin, while gestational diabetes is hyperglycemia found during pregnancy. The type most commonly found in patients is DM type 2. The main factors in this disease are genetics (A) and lifestyle (E). A disease with these two factors can be described with an additive genetic and unique environment (AE) model. This article discusses parameter estimation of the AE model using a Bayesian method and a simulation of the inheritance of the trait from parent to offspring. In the AE model, the response variable, predictor variables, and parameters represent the population under study. The population can be measured through a random sample. The response and predictor variables can be determined from the sample, while the parameters are unknown and must therefore be estimated from the sample. Estimates of the AE model parameters were obtained from the joint posterior distribution. A simulation was conducted to obtain the genetic variance and lifestyle variance. The simulation results are 0.3600 for the genetic variance and 0.0899 for the lifestyle variance. Therefore, the variance of the genetic factor in DM type 2 is greater than that of lifestyle.
Keywords: AE model, Bayesian method, diabetes mellitus type 2, genetic, life style
Procedia PDF Downloads 284
36938 Vibration-Based Data-Driven Model for Road Health Monitoring
Authors: Guru Prakash, Revanth Dugalam
Abstract:
A road’s condition often deteriorates due to harsh loading, such as overloading by trucks, and severe environmental conditions, such as heavy rain, snow load, and cyclic loading. In the absence of proper maintenance planning, this results in potholes, wide cracks, bumps, and increased road roughness. In this paper, a data-driven model is developed to detect such damage using vibration and image signals. The key idea of the proposed methodology is that road anomalies manifest themselves in these signals and can be detected by training a machine learning algorithm. The use of various machine learning techniques, such as the support vector machine and the random forest method, will be investigated. The proposed model will first be trained and tested with artificially simulated data, and the model architecture will be finalized by comparing the accuracies of the candidate models. Once a model is fixed, a field study will be performed and data will be collected. The field data will be used to validate the proposed model and to predict the future health condition of the road. The proposed approach will help automate the road condition monitoring process, repair cost estimation, and maintenance planning.
Keywords: SVM, data-driven, road health monitoring, pot-hole
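As a minimal sketch of the comparison stage described above, the scikit-learn code below trains the two classifiers mentioned (support vector machine and random forest) on artificially simulated vibration features; the feature definitions and data are placeholders, not the paper's dataset.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
# placeholder features, e.g. RMS, peak and variance of a vertical-acceleration window
X_smooth = rng.normal(0.0, 1.0, size=(200, 3))
X_damaged = rng.normal(1.5, 1.2, size=(200, 3))      # anomalies shift the window statistics
X = np.vstack([X_smooth, X_damaged])
y = np.array([0] * 200 + [1] * 200)                   # 0 = healthy road, 1 = pothole/bump

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
for clf in (SVC(kernel="rbf"), RandomForestClassifier(n_estimators=200, random_state=0)):
    clf.fit(X_tr, y_tr)
    print(type(clf).__name__, accuracy_score(y_te, clf.predict(X_te)))
```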
Procedia PDF Downloads 86
36937 Theoretical Framework for Value Creation in Project Oriented Companies
Authors: Mariusz Hofman
Abstract:
The paper ‘Theoretical Framework for Value Creation in Project-Oriented Companies’ is designed to determine how organisations create value and whether this allows them to achieve market success. An assumption has been made that there are two routes to achieving this value. The first is to create intangible assets (i.e. the resources of human, structural and relational capital), while the other is to create added value (understood as the surplus of revenue over costs). It has also been assumed that the combination of the achieved added value and unique intangible assets translates into the success of a project-oriented company. The purpose of the paper is to present a hypothetico-deductive model describing the modus operandi of such companies and an approach to operationalising the model. All the latent variables included in the model are theoretical constructs with observational indicators (measures). The existence of the latent variables (constructs), and also of the submodels, will be confirmed on the basis of a covariance matrix derived from empirical data, i.e. a set of observational indicators (measures). This will be achieved with confirmatory factor analysis (CFA). Through this statistical procedure, it will be verified whether the matrix arising from the adopted theoretical model differs statistically from the empirical covariance matrix arising from the system of equations. The fit of the model to the empirical data will be evaluated using a number of indicators: χ2, RMSEA and the CFI (Comparative Fit Index). If the theoretical conjectures are confirmed, an interesting development path can be defined for project-oriented companies. This will let such organisations perform efficiently in the face of growing competition and pressure for innovation.
Keywords: value creation, project-oriented company, structural equation modelling
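For reference, the two fit indices named above can be computed from the model and baseline chi-square statistics using their standard formulas; the sketch below uses illustrative numbers, not results from the study.

```python
import math

def rmsea(chi2, df, n):
    """Root mean square error of approximation (standard formula)."""
    return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

def cfi(chi2_model, df_model, chi2_baseline, df_baseline):
    """Comparative fit index relative to the independence (baseline) model."""
    d_model = max(chi2_model - df_model, 0.0)
    d_base = max(chi2_baseline - df_baseline, d_model)
    return 1.0 - d_model / d_base if d_base > 0 else 1.0

# illustrative values only (not results from the paper)
print(rmsea(chi2=185.4, df=120, n=250))      # values below ~0.06 are usually read as good fit
print(cfi(185.4, 120, 1450.0, 136))          # values above ~0.95 are usually read as good fit
```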
Procedia PDF Downloads 297
36936 Air Quality Forecast Based on Principal Component Analysis-Genetic Algorithm and Back Propagation Model
Authors: Bin Mu, Site Li, Shijin Yuan
Abstract:
Amid environmental deterioration, people are increasingly concerned about the quality of the environment, especially air quality. It is therefore of great value to give an accurate and timely forecast of the AQI (air quality index). In order to simplify the influencing factors of a city's air quality and forecast the city's AQI for the next day, this study used MATLAB and constructed a mathematical PCA-GABP model. Specifically, the study first applied principal component analysis (PCA) to the factors influencing tomorrow's AQI, covering weather, industrial waste gas and today's IAQI data. A back propagation neural network (BP), optimized by a genetic algorithm (GA), was then used to forecast tomorrow's AQI. To verify the validity and accuracy of the PCA-GABP model's forecasting capability, the study uses two statistical indices to evaluate the AQI forecasts (normalized mean square error and fractional bias). Finally, the study reduces the mean square error by optimizing the individual gene structure in the genetic algorithm and adjusting the parameters of the back propagation model. In conclusion, the model's AQI forecasting performance is fairly convincing, and the model is expected to contribute positively to AQI forecasting in the future.
Keywords: AQI forecast, principal component analysis, genetic algorithm, back propagation neural network model
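The following scikit-learn sketch shows the PCA-plus-neural-network part of such a pipeline on simulated data; the genetic-algorithm tuning of the network is omitted, and the input variables are placeholders, so this is a simplified stand-in for the PCA-GABP model rather than the authors' code.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# placeholder predictors: today's IAQI sub-indices, weather and waste-gas variables
X = rng.normal(size=(500, 12))
y = X[:, :4].sum(axis=1) + rng.normal(scale=0.3, size=500)   # tomorrow's AQI (synthetic)

# PCA reduces the correlated inputs; the MLP plays the role of the BP network
model = make_pipeline(StandardScaler(), PCA(n_components=5),
                      MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0))
model.fit(X[:400], y[:400])
pred = model.predict(X[400:])

# fractional bias, one of the two evaluation indices mentioned in the abstract
print("fractional bias:", 2 * (pred.mean() - y[400:].mean()) / (pred.mean() + y[400:].mean()))
```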
Procedia PDF Downloads 227
36935 Computer Based Model for Collaborative Research as a Panacea for National Development in Third World Countries
Authors: M. A. Rahman, A. O. Enikuomehin
Abstract:
Collaborative research can be described simply as a shared commitment to reach a common research goal by harnessing the resources of two or more parties. Besides avoiding the duplication of research, the benefits that often accrue from such research alliances include savings in time and in the expense of completing studies. Likewise, it provides an avenue to produce a wider horizon of scientific knowledge through the pooling of skills, knowledge and resources. In institutions of higher learning and research institutes, it often gives scholars an opportunity to strengthen the teaching and research capacity of their institutions. Between industry and institutions, collaborative research breeds promising relationships that can be geared towards addressing various research problems, such as producing and enhancing industry-based products and services, including technology transfer. For Nigeria to take advantage of such collaboration, the different issues that can arise during a collaborative research programme, such as technology licensing, intellectual property rights, confidentiality, and funding, are identified in this paper. An important tool for achieving this in a developing economy is the use of an appropriate computer model. The paper highlights the costs of collaboration, stresses the need to evaluate the effectiveness and efficiency of collaborative research activities, and proposes an appropriate computer model to assist in this regard.
Keywords: collaborative research, developing country, computerization, model
Procedia PDF Downloads 332
36934 An Output Oriented Super-Efficiency Model for Considering Time Lag Effect
Authors: Yanshuang Zhang, Byungho Jeong
Abstract:
There is some time lag between the consumption of inputs and the production of outputs, and this time lag effect should be considered when calculating the efficiency of decision making units (DMUs). Recently, a couple of DEA models were developed to account for the time lag effect when evaluating the efficiency of research activities. However, these models cannot discriminate between efficient DMUs because, by the nature of the basic DEA model, efficiency scores are limited to 1. This problem can be resolved with a super-efficiency model; however, a super-efficiency model sometimes suffers from infeasibility. This paper suggests an output-oriented super-efficiency model for efficiency evaluation under the consideration of the time lag effect. A case example using a long-term research project is given to compare the suggested model with the MpO model.
Keywords: DEA, super-efficiency, time lag, research activities
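To make the super-efficiency idea concrete, the sketch below solves a generic output-oriented CCR super-efficiency linear program with scipy, excluding the evaluated DMU from its own reference set; the paper's model, which also handles time lags, is more elaborate, and the data here are toy values.

```python
import numpy as np
from scipy.optimize import linprog

def output_super_efficiency(X, Y, o):
    """Generic output-oriented CCR super-efficiency for DMU `o`.
    X: (n_dmu, n_inputs), Y: (n_dmu, n_outputs). DMU `o` is excluded from its own
    reference set; phi < 1 marks a super-efficient unit under output orientation."""
    n = X.shape[0]
    others = [j for j in range(n) if j != o]
    # decision vector: [phi, lambda_j for j != o]; maximise phi -> minimise -phi
    c = np.r_[-1.0, np.zeros(len(others))]
    A_in = np.c_[np.zeros((X.shape[1], 1)), X[others].T]       # sum lam_j x_ij <= x_io
    b_in = X[o]
    A_out = np.c_[Y[o][:, None], -Y[others].T]                  # phi*y_ro - sum lam_j y_rj <= 0
    b_out = np.zeros(Y.shape[1])
    res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=np.r_[b_in, b_out],
                  bounds=[(0, None)] * (1 + len(others)), method="highs")
    return res.x[0]

# toy data: 5 DMUs, 2 inputs, 1 output (illustrative only)
X = np.array([[2., 3.], [3., 2.], [4., 4.], [5., 3.], [3., 5.]])
Y = np.array([[4.], [5.], [6.], [5.], [4.]])
print([round(output_super_efficiency(X, Y, o), 3) for o in range(5)])
```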
Procedia PDF Downloads 657
36933 Numerical Analysis of 3D Electromagnetic Fields in Annular Induction Plasma
Authors: Abderazak Guettaf
Abstract:
The mathematical models of the physical phenomena interacting in inductive plasma are described by the physics equations of continuous media. A 3D model based on the magnetic vector potential and electric scalar potential (A, V) formulation is used. The finite volume method is applied to the electromagnetic equations to obtain the field distribution inside the plasma. The numerical results of the method, developed on a basic model derived from a real three-dimensional model, are presented. From the 3D mathematical model, its spreading assumptions and boundary conditions, we evaluated the electric field in the load and developed a numerical code in the MATLAB environment, verifying the effectiveness and validity of this code.
Keywords: electric field, 3D magnetic vector potential and electric scalar potential (A, V) formulation, finite volumes, annular plasma
Procedia PDF Downloads 491
36932 Modal Analysis of Small Frames using High Order Timoshenko Beams
Authors: Chadi Azoury, Assad Kallassy, Pierre Rahme
Abstract:
In this paper, we consider the modal analysis of small frames. First, we construct a 3D model using H8 elements and find the natural frequencies of the frame, focusing our attention on the modes in the XY plane. Second, we construct a 2D plane-stress model using Q4 elements. We conclude that the results of the two models are very close to each other. We then formulate the stiffness and mass matrices of the 3-noded Timoshenko beam, which is well suited for thick, short beams such as those in our case. Finally, we model the corners where the horizontal and vertical bars meet with a special matrix. The results of our new model (3-noded Timoshenko beams for the horizontal and vertical bars and a special corner element based on Q4 elements) are very satisfactory in the modal analysis.
Keywords: corner element, high-order Timoshenko beam, Guyan reduction, modal analysis of frames, rigid link, shear locking, and short beams
Procedia PDF Downloads 318
36931 Development and Investigation of Efficient Substrate Feeding and Dissolved Oxygen Control Algorithms for Scale-Up of Recombinant E. coli Cultivation Process
Authors: Vytautas Galvanauskas, Rimvydas Simutis, Donatas Levisauskas, Vykantas Grincas, Renaldas Urniezius
Abstract:
The paper deals with the model-based development and implementation of efficient control strategies for recombinant protein synthesis in fed-batch E. coli cultivation processes. Based on experimental data, a kinetic dynamic model of the cultivation process was developed and used to determine substrate feeding strategies during cultivation. The proposed feeding strategy consists of two phases: a biomass growth phase and a recombinant protein production phase. In the first phase, a substrate-limited process is recommended in which the specific growth rate of biomass is about 90-95% of its maximum value. This reduces the glucose concentration in the medium, improves process repeatability, and limits the formation of secondary metabolites and other unwanted by-products. The substrate limitation can be tightened to satisfy the restriction on the maximum oxygen transfer rate in the bioreactor and to guarantee the necessary dissolved carbon dioxide concentration in the culture medium. In the recombinant protein production phase, the level of substrate limitation and the specific growth rate are selected within the range that enables an optimal target protein synthesis rate. To account for the complex process dynamics, to efficiently exploit the oxygen transfer capability of the bioreactor, and to maintain the required dissolved oxygen concentration, adaptive control algorithms for dissolved oxygen control have been proposed. The developed model-based control strategies are useful in the scale-up of cultivation processes and accelerate the implementation of innovative biotechnological processes in industrial applications.
Keywords: adaptive algorithms, model-based control, recombinant E. coli, scale-up of bioprocesses
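As a simple illustration of a substrate-limited feeding profile that holds a chosen specific growth rate, the sketch below evaluates the classical exponential feeding law; the parameter values are illustrative and the paper's model-based two-phase strategy with adaptive dissolved-oxygen control is more involved.

```python
import numpy as np

def exponential_feed(t, mu_set=0.25, Yxs=0.45, ms=0.04, X0=5.0, V0=4.0, Sf=500.0):
    """Classical exponential feeding law that keeps the specific growth rate near mu_set
    in a substrate-limited fed-batch:
        F(t) = (mu_set / Yxs + ms) * X0 * V0 * exp(mu_set * t) / Sf
    mu_set [1/h], Yxs [g/g], ms [g/(g h)], X0 [g/L], V0 [L], Sf [g/L]; values are illustrative."""
    return (mu_set / Yxs + ms) * X0 * V0 * np.exp(mu_set * t) / Sf

t = np.linspace(0.0, 10.0, 6)          # hours after the start of feeding
print(exponential_feed(t))             # feed-rate profile in L/h
```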
Procedia PDF Downloads 257
36930 A Robust and Efficient Segmentation Method Applied for Cardiac Left Ventricle with Abnormal Shapes
Authors: Peifei Zhu, Zisheng Li, Yasuki Kakishita, Mayumi Suzuki, Tomoaki Chono
Abstract:
Segmentation of the left ventricle (LV) from cardiac ultrasound images provides a quantitative functional analysis of the heart for diagnosing disease. The Active Shape Model (ASM) is a widely used approach for LV segmentation but suffers from the drawback that the initialization of the shape model is not sufficiently close to the target, especially when dealing with the abnormal shapes that occur in disease. In this work, a two-step framework is proposed to improve the accuracy and speed of model-based segmentation. First, a robust and efficient detector based on a Hough forest is proposed to localize cardiac feature points, and these points are used to predict the initial fit of the LV shape model. Second, to achieve a more accurate and detailed segmentation, ASM is applied to further fit the LV shape model to the cardiac ultrasound image. The performance of the proposed method is evaluated on a dataset of 800 cardiac ultrasound images that are mostly of abnormal shapes. The proposed method is compared to several combinations of ASM and existing initialization methods. The experimental results demonstrate that the accuracy of feature point detection for initialization was improved by 40% compared to the existing methods. Moreover, the proposed method significantly reduces the number of necessary ASM fitting loops, thus speeding up the whole segmentation process. Therefore, the proposed method is able to achieve more accurate and efficient segmentation results and is applicable to unusual heart shapes caused by cardiac diseases, such as left atrial enlargement.
Keywords: hough forest, active shape model, segmentation, cardiac left ventricle
Procedia PDF Downloads 339
36929 QSAR Studies of Certain Novel Heterocycles Derived from Bis-1,2,4-Triazoles as Anti-Tumor Agents
Authors: Madhusudan Purohit, Stephen Philip, Bharathkumar Inturi
Abstract:
In this paper, we report a quantitative structure-activity relationship study of novel bis-triazole derivatives for predicting their activity profile. The full model encompassed a dataset of 46 bis-triazoles. The Tripos Sybyl X 2.0 program was used for CoMSIA QSAR modeling. The partial least-squares (PLS) method was used for the statistical analysis and to derive a QSAR model based on the field values of the CoMSIA descriptors. The compounds were divided into test and training sets and evaluated with various CoMSIA parameters to find the best QSAR model. An optimum number of components was first determined separately by cross-validated regression for the CoMSIA model and then applied in the final analysis. A series of parameters was examined, and the best-fit model was obtained using the donor, partition coefficient and steric parameters. The CoMSIA model showed good statistical results, with a regression coefficient (r²) and a cross-validated coefficient (q²) of 0.575 and 0.830, respectively. The standard error of the predicted model was 0.16322. In the CoMSIA model, the steric descriptors make a marginally larger contribution than the electrostatic descriptors. The finding that the steric descriptor is the largest contributor to the CoMSIA QSAR models is consistent with the observation that more than half of the binding site area is occupied by steric regions.
Keywords: 3D QSAR, CoMSIA, triazoles, novel heterocycles
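The PLS analysis and the r²/q² statistics mentioned above can be reproduced generically as in the sketch below, which fits a scikit-learn PLS regression to a placeholder descriptor matrix and computes the leave-one-out cross-validated q²; it is not the Sybyl/CoMSIA workflow itself.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(0)
# placeholder matrix: rows = 46 compounds, columns = CoMSIA-style field descriptors; y = activity
X = rng.normal(size=(46, 30))
y = X[:, :5] @ rng.normal(size=5) + rng.normal(scale=0.5, size=46)

pls = PLSRegression(n_components=5)            # number of latent variables chosen by cross-validation
pls.fit(X, y)
r2 = pls.score(X, y)                           # conventional r^2

y_loo = cross_val_predict(pls, X, y, cv=LeaveOneOut()).ravel()
q2 = 1 - ((y - y_loo) ** 2).sum() / ((y - y.mean()) ** 2).sum()   # cross-validated q^2
print(round(r2, 3), round(q2, 3))
```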
Procedia PDF Downloads 444
36928 Validation of the Formal Model of Web Services Applications for Digital Reference Service of Library Information System
Authors: Zainab Magaji Musa, Nordin M. A. Rahman, Julaily Aida Jusoh
Abstract:
The web services applications for digital reference service (WSDRS) of LIS model is an informal model that claims to reduce the problems of digital reference services in libraries. It uses web services technology to provide an efficient way of satisfying users’ needs in the reference section of libraries. The formal WSDRS model consists of the Z specifications of all the informal specifications of the model. This paper discusses the formal validation of the Z specifications of the WSDRS model. The authors formally verify, and thus validate, the properties of the model using the Z/EVES theorem prover.
Keywords: validation, verification, formal, theorem prover
Procedia PDF Downloads 515
36927 A Transformer-Based Approach for Multi-Human 3D Pose Estimation Using Color and Depth Images
Authors: Qiang Wang, Hongyang Yu
Abstract:
Multi-human 3D pose estimation is a challenging task in computer vision that aims to recover the 3D joint locations of multiple people from multi-view images. In contrast to traditional methods, which typically use only color (RGB) images as input, our approach utilizes both the color and depth (D) information contained in RGB-D images. We also employ a transformer-based model as the backbone of our approach, which is able to capture long-range dependencies and has been shown to perform well on various sequence modeling tasks. Our method is trained and tested on the Carnegie Mellon University (CMU) Panoptic dataset, which contains a diverse set of indoor and outdoor scenes with multiple people in varying poses and clothing. We evaluate the performance of our model with the standard 3D pose estimation metric of mean per-joint position error (MPJPE). Our results show that the transformer-based approach outperforms traditional methods and achieves competitive results on the CMU Panoptic dataset. We also perform an ablation study to understand the impact of different design choices on the overall performance of the model. In summary, our work demonstrates the effectiveness of using a transformer-based approach with RGB-D images for multi-human 3D pose estimation, with potential applications in real-world scenarios such as human-computer interaction, robotics, and augmented reality.
Keywords: multi-human 3D pose estimation, RGB-D images, transformer, 3D joint locations
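For clarity, the evaluation metric named above (MPJPE) is simply the mean Euclidean distance between predicted and ground-truth joints; a minimal numpy sketch with synthetic joint coordinates is shown below.

```python
import numpy as np

def mpjpe(pred, gt):
    """Mean per-joint position error: average Euclidean distance between predicted
    and ground-truth 3D joints. pred, gt: arrays of shape (n_people, n_joints, 3)."""
    return np.linalg.norm(pred - gt, axis=-1).mean()

# toy example: two people, 15 joints each, coordinates in millimetres (synthetic)
rng = np.random.default_rng(0)
gt = rng.uniform(-1000, 1000, size=(2, 15, 3))
pred = gt + rng.normal(scale=40.0, size=gt.shape)   # ~40 mm of noise per axis
print(round(mpjpe(pred, gt), 1), "mm")
```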
Procedia PDF Downloads 80
36926 Exploring the Applications of Neural Networks in the Adaptive Learning Environment
Authors: Baladitya Swaika, Rahul Khatry
Abstract:
Computer adaptive tests (CATs) are among the most efficient ways of testing the cognitive abilities of students. CATs are based on item response theory (IRT), which relies on item selection and ability estimation using the statistical methods of maximum-information item selection and maximum-likelihood (ML) or maximum a posteriori (MAP) estimation, respectively. This study aims to combine the classical and Bayesian approaches to IRT to create a dataset that is then fed to a neural network, automating the process of ability estimation, and to compare the result with traditional CAT models designed using IRT. The study uses Python as the base coding language, pymc for statistical modelling of the IRT and scikit-learn for the neural network implementation. On building the models and comparing them, it is found that the neural-network-based model performs 7-10% worse than the IRT model for score estimation. Although it performs worse than the IRT model, the neural network model can still be used beneficially in back-ends to reduce time complexity: the IRT model must re-calculate the ability every time it receives a request, whereas a trained regressor can produce a prediction in a single step. This study also proposes a new kind of framework in which the neural network model incorporates feature sets beyond the normal IRT feature set and uses a neural network's capacity to learn unknown functions, giving rise to better CAT models. Categorical features such as test type could be learnt and incorporated into IRT functions with the help of techniques like logistic regression, and could be used to learn functions expressed as models that are not trivial to express via equations. Such a framework, when implemented, would be highly advantageous in psychometrics and cognitive assessment. This study gives a brief overview of how neural networks can be used in adaptive testing, not only to reduce time complexity but also to incorporate newer and richer feature sets, eventually leading to higher-quality testing.
Keywords: computer adaptive tests, item response theory, machine learning, neural networks
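The two IRT mechanisms mentioned above, maximum-information item selection and MAP ability estimation, can be sketched generically as follows for a 2PL model; the item bank and responses are hypothetical and the code is not the study's pymc/scikit-learn pipeline.

```python
import numpy as np

def p_2pl(theta, a, b):
    """2PL item response function."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def item_information(theta, a, b):
    """Fisher information of a 2PL item at ability theta: a^2 * P * (1 - P)."""
    p = p_2pl(theta, a, b)
    return a ** 2 * p * (1 - p)

def map_ability(responses, a, b, grid=np.linspace(-4, 4, 401)):
    """MAP ability estimate on a grid with a standard normal prior."""
    p = p_2pl(grid[:, None], a, b)
    log_post = (responses * np.log(p) + (1 - responses) * np.log(1 - p)).sum(axis=1)
    log_post += -0.5 * grid ** 2                      # N(0, 1) prior
    return grid[np.argmax(log_post)]

# hypothetical item bank and a short adaptive step
a = np.array([1.2, 0.8, 1.5, 1.0])
b = np.array([-1.0, 0.0, 0.5, 1.5])
answered, responses = [0], np.array([1])              # first item answered correctly
theta = map_ability(responses, a[answered], b[answered])
info = item_information(theta, a, b)
info[answered] = -np.inf                              # do not re-administer items
print("ability:", round(theta, 2), "next item:", int(np.argmax(info)))
```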
Procedia PDF Downloads 175
36925 Study and Construction on Signalling System during Reverse Motion Due to Obstacle
Authors: S. M. Yasir Arafat
Abstract:
Driving models are needed by many researchers to improve traffic safety and to advance autonomous vehicle design. To be most useful, a driving model must state specifically what information is needed and how it is processed. We therefore developed an 'Obstacle Avoidance and Detection Autonomous Car' based on sensor applications. The ever-increasing technological demands of today call for very complex systems, which in turn require highly sophisticated controllers to ensure that high performance can be achieved and maintained under adverse conditions. Based on a developed model of brake operation, a controller for the braking system has been designed. Its task is to control the operation of the braking system more accurately than has been the case until now.
Keywords: automobile, obstacle, safety, sensing
Procedia PDF Downloads 364
36924 Multi-Atlas Segmentation Based on Dynamic Energy Model: Application to Brain MR Images
Authors: Jie Huo, Jonathan Wu
Abstract:
Segmentation of anatomical structures in medical images is essential for scientific inquiry into the complex relationships between biological structure and clinical diagnosis, treatment and assessment. As a method of incorporating prior knowledge and the anatomical structure similarity between a target image and atlases, multi-atlas segmentation has been successfully applied to a variety of medical images, including brain, cardiac, and abdominal images. The basic idea of multi-atlas segmentation is to transfer the labels in the atlases to the coordinates of the target image by matching each target patch to atlas patches in its neighborhood. However, this technique is limited by the pairwise registration between the target image and the atlases. In this paper, a novel multi-atlas segmentation approach is proposed by introducing a dynamic energy model. First, the target is mapped to each atlas image by minimizing the dynamic energy function; then the segmentation of the target image is generated by weighted fusion based on the energy. The method is tested on the MICCAI 2012 Multi-Atlas Labeling Challenge dataset, which includes 20 target images and 15 atlas images. The paper also analyzes the influence of different parameters of the dynamic energy model on the segmentation accuracy and measures the Dice coefficient obtained with different feature terms in the energy model. The highest mean Dice coefficient obtained with the proposed method is 0.861, which is competitive with recently published methods.
Keywords: brain MRI segmentation, dynamic energy model, multi-atlas segmentation, energy minimization
Procedia PDF Downloads 336
36923 Sentiment Classification of Documents
Authors: Swarnadip Ghosh
Abstract:
Sentiment analysis is the process of detecting the contextual polarity of text; in other words, it determines whether a piece of writing is positive, negative or neutral. Sentiment analysis of documents holds great importance in today's world, when vast amounts of information are stored in databases and on the world wide web. An efficient algorithm to elicit such information would be beneficial for social, economic as well as medical purposes. In this project, we have developed an algorithm to classify a document as positive or negative. Using our algorithm, we obtained a feature set from the data and classified the documents based on this feature set. It is important to note that, in the classification, we have not used the independence assumption relied upon by many procedures such as naive Bayes, which makes the algorithm more general in scope. Moreover, because of the sparsity and high dimensionality of such data, we did not use the empirical distribution for estimation, but developed a method based on the degree of close clustering of the data points. We applied our algorithm to a movie review dataset obtained from IMDb and obtained satisfactory results.
Keywords: sentiment, Run's Test, cross validation, higher dimensional pmf estimation
Procedia PDF Downloads 402
36922 Simulation of Red Blood Cells in Complex Micro-Tubes
Authors: Ting Ye, Nhan Phan-Thien, Chwee Teck Lim, Lina Peng, Huixin Shi
Abstract:
In biofluid flow systems, the flow of fluids with complex structures, such as red blood cells (RBCs) moving through complex capillary vessels, often needs to be considered. In this paper, we apply a particle-based method, smoothed dissipative particle dynamics (SDPD), to simulate the motion and deformation of RBCs in complex micro-tubes. We first present the theoretical models, including the SDPD model, the RBC-fluid interaction model, the RBC deformation model, the RBC aggregation model, and the boundary treatment model. We then show the verification and validation of these models by comparing our numerical results with theoretical, experimental and previously published numerical results. Finally, we provide simulation cases covering the motion and deformation of RBCs in rectangular, cylindrical, curved, bifurcated, and constricted micro-tubes, respectively.
Keywords: aggregation, deformation, red blood cell, smoothed dissipative particle dynamics
Procedia PDF Downloads 174
36921 Analysis of the Decoupling Relationship between Urban Green Development and the Level of Regional Integration Based on the Tapio Model
Authors: Ruoyu Mao
Abstract:
Exploring the relationship between urban green development and the level of regional integration is of great significance for realising high-quality and sustainable regional development. Based on the Tapio decoupling model and a theoretical framework of urban green development and regional integration, this paper builds an analysis system, quantitatively analyses urban green development and the level of regional integration over a given period, and discusses the relationship between the two. It takes China's Yangtze River Delta urban agglomeration as an example to study the degree of decoupling, the type of decoupling, and the evolution of the spatio-temporal pattern of decoupling between the level of urban green development and the level of regional integration over the period 2014-2021, with the aim of providing a useful reference for the future development of the region.
Keywords: regional integration, urban green development, Tapio decoupling model, Yangtze River Delta urban agglomeration
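The core of the Tapio model is a decoupling elasticity, the ratio of the relative changes of the two indicators, classified into the conventional eight states with thresholds of 0.8 and 1.2. The sketch below computes this elasticity and classification with invented index values; the paper's indicator system and results are not reproduced here.

```python
def tapio_elasticity(dx, x0, dy, y0):
    """Tapio decoupling elasticity: relative change of indicator x (e.g. urban green
    development) over relative change of indicator y (e.g. regional integration level)."""
    return (dx / x0) / (dy / y0)

def tapio_state(dx, dy, e):
    """Conventional eight Tapio decoupling states (thresholds 0.8 and 1.2)."""
    if dx < 0 and dy > 0:
        return "strong decoupling"
    if dx > 0 and dy < 0:
        return "strong negative decoupling"
    if dx > 0 and dy > 0:
        return ("weak decoupling" if e < 0.8 else
                "expansive coupling" if e <= 1.2 else "expansive negative decoupling")
    return ("weak negative decoupling" if e < 0.8 else
            "recessive coupling" if e <= 1.2 else "recessive decoupling")

# illustrative figures only: green-development index and integration index for one city
dx, x0, dy, y0 = 0.06, 0.52, 0.15, 0.61
e = tapio_elasticity(dx, x0, dy, y0)
print(round(e, 2), tapio_state(dx, dy, e))
```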
Procedia PDF Downloads 43
36920 Analysis of Cyclic Elastic-Plastic Loading of Shaft Based on Kinematic Hardening Model
Authors: Isa Ahmadi, Ramin Khamedi
Abstract:
In this paper, the elasto-plastic cyclic torsion of a shaft is studied using a finite element method. The Prager kinematic hardening theory of plasticity, combined with the Ramberg-Osgood stress-strain equation, is used to evaluate the cyclic behavior of the shaft under torsional loading. The material of the shaft is assumed to follow a non-linear strain hardening law based on the Prager model. A finite element method with C1 continuity is developed and used to solve the governing equations of the problem. The successive substitution iterative method is used to calculate the distribution of stresses and plastic strains in the shaft due to cyclic loads. The distributions of shear stress, effective stress, residual stress, and elastic and plastic shear strain are presented in the numerical results.
Keywords: cyclic loading, finite element analysis, Prager kinematic hardening model, torsion of shaft
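For reference, the Ramberg-Osgood relation mentioned above splits the strain into an elastic and a power-law plastic part; a minimal sketch in shear form, with illustrative material constants (not the paper's values), is given below.

```python
import numpy as np

def ramberg_osgood_shear(tau, G=80e3, K=1100.0, n=0.12):
    """Ramberg-Osgood shear stress-strain relation: gamma = tau/G + (tau/K)**(1/n).
    G and K [MPa] and the hardening exponent n are illustrative, not the paper's values."""
    return tau / G + (tau / K) ** (1.0 / n)

tau = np.linspace(0.0, 500.0, 6)          # shear stress levels in MPa
print(np.round(ramberg_osgood_shear(tau), 5))
```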
Procedia PDF Downloads 408
36919 Modeling and Experimental Verification of Crystal Growth Kinetics in Glass Forming Alloys
Authors: Peter K. Galenko, Stefanie Koch, Markus Rettenmayr, Robert Wonneberger, Evgeny V. Kharanzhevskiy, Maria Zamoryanskaya, Vladimir Ankudinov
Abstract:
We analyze the structure of undercooled melts, the crystal growth kinetics, and the amorphous/crystalline microstructure of rapidly solidifying glass-forming Pd-based and CuZr-based alloys. A dendrite growth model is developed using a combination of the kinetic phase-field model and a mesoscopic sharp-interface model. The model predicts features of crystallization kinetics in alloys from thermodynamically controlled growth (governed by the Gibbs free energy change on solidification) to the kinetically limited regime (governed by atomic attachment-detachment processes at the solid/liquid interface). Comparing the critical undercoolings observed in the crystallization kinetics with experimental data on melt viscosity, atomistic simulation data on liquid microstructure, and the theoretically predicted dendrite growth velocity allows us to conclude that the dendrite growth kinetics strongly depends on changes in the cluster structure of the melt. The data from these theoretical and experimental investigations are used to interpret the microstructure of samples processed in the electromagnetic levitator on board the International Space Station within the projects "MULTIPHAS" (European Space Agency and German Aerospace Center, 50WM1941) and "KINETIKA" (ROSKOSMOS).
Keywords: dendrite, kinetics, model, solidification
Procedia PDF Downloads 120
36918 Develop a Conceptual Data Model of Geotechnical Risk Assessment in Underground Coal Mining Using a Cloud-Based Machine Learning Platform
Authors: Reza Mohammadzadeh
Abstract:
The major challenges of geotechnical engineering in underground spaces arise from uncertainties and differing probabilities. The collection, collation, and collaboration of existing data, so that they can be incorporated in analysis and design for a given prospect evaluation, would be a reliable and practical problem-solving method under uncertainty. Machine learning (ML) is a subfield of artificial intelligence within statistical science that applies different techniques (e.g., regression, neural networks, support vector machines, decision trees, random forests, genetic programming, etc.) to data to automatically learn and improve from them without being explicitly programmed, and to make decisions and predictions. In this paper, a conceptual database schema of geotechnical risks in underground coal mining, based on a cloud system architecture, has been designed. A new risk assessment approach using a three-dimensional risk matrix supported by the level of knowledge (LoK) is proposed in this model. The workflow stages of the model methodology are then described. An ML platform has been implemented for data training and deployment of the LoK models. IBM Watson Studio, a leading data science tool and data-driven cloud integration ML platform, is employed in this study. As a use case, a data set of geotechnical hazards and risk assessments in underground coal mining was prepared to demonstrate the performance of the model, and the results are outlined accordingly.
Keywords: data model, geotechnical risks, machine learning, underground coal mining
Procedia PDF Downloads 274
36917 Alteration of Bone Strength in Osteoporosis of Mouse Femora: Computational Study Based on Micro CT Images
Authors: Changsoo Chon, Sangkuy Han, Donghyun Seo, Jihyung Park, Bokku Kang, Hansung Kim, Keyoungjin Chun, Cheolwoong Ko
Abstract:
The purpose of the study is to develop a finite element model based on 3D micro-CT images of bone structure and to analyze the stress distribution in osteoporotic mouse femora. The finite element analysis shows that, in the early-osteoporosis mouse model, bone density decreased in the trabecular region, whereas bone density in the cortical region increased.
Keywords: micro-CT, finite element analysis, osteoporosis, bone strength
Procedia PDF Downloads 363