Search results for: Virtual prototyping (VP) models.
2375 MARTI and MRSD: Newly Developed Isolation-Damping Devices with Adaptive Hardening for Seismic Protection of Structures
Authors: Murat Dicleli, Ali Salem Milani
Abstract:
In this paper, a summary of analytical and experimental studies into the behavior of a new hysteretic damper designed for seismic protection of structures is presented. The Multidirectional Torsional Hysteretic Damper (MRSD) is a patented invention in which a symmetrical arrangement of identical cylindrical steel cores is configured to yield in torsion while the structure experiences planar movements due to earthquake shaking. The new device has certain desirable properties. Notably, it is characterized by a variable post-elastic stiffness that is controllable via design. This property is a result of the MRSD's kinematic configuration, which produces geometric hardening rather than a secondary large-displacement effect. Additionally, the new system is capable of reaching high force and displacement capacities, shows high levels of damping, and exhibits very stable cyclic response. The device has gone through many stages of design refinement, multiple prototype verification tests, and the development of design guidelines and computer codes to facilitate its implementation in practice. Practicality of the new device, as an offspring of an academic environment, is assured through extensive collaboration with industry in its final design stages, prototyping and verification test programs.
Keywords: Seismic, isolation, damper, adaptive stiffness.
2374 Daily Probability Model of Storm Events in Peninsular Malaysia
Authors: Mohd Aftar Abu Bakar, Noratiqah Mohd Ariff, Abdul Aziz Jemain
Abstract:
Storm Event Analysis (SEA) provides a method to define rainfall events as storms, where each storm has its own amount and duration. By modelling the daily probability of different types of storms, the onset, offset and cycle of rainfall seasons can be determined and investigated. Furthermore, researchers from the field of meteorology will be able to study the dynamical characteristics of rainfall and make predictions for future reference. In this study, four categories of storms (short, intermediate, long and very long) are introduced based on the length of storm duration. Daily probability models of storms are built for these four categories in Peninsular Malaysia. The models are constructed by using the Bernoulli distribution and by applying linear regression on the first Fourier harmonic equation. From the models obtained, it is found that the daily probability of storms in the eastern part of Peninsular Malaysia shows a unimodal pattern, with a high probability of rain beginning at the end of the year and lasting until early the next year. This is very likely due to the Northeast monsoon season, which occurs from November to March every year. Meanwhile, short and intermediate storms in other regions of Peninsular Malaysia experience a bimodal cycle due to the two inter-monsoon seasons. Overall, these models indicate that Peninsular Malaysia can be divided into four distinct regions based on the daily pattern of the probability of various storm events.
Keywords: Daily probability model, monsoon seasons, regions, storm events.
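The abstract describes fitting Bernoulli occurrence probabilities with a linear regression on the first Fourier harmonic of the day of year. The sketch below is a minimal illustration of that idea, p(t) = b0 + b1*sin(2*pi*t/365) + b2*cos(2*pi*t/365); the synthetic data, 365-day period and clipping are assumptions, not the authors' setup.

```python
import numpy as np

def fit_first_harmonic(day_of_year, storm_occurred):
    """Fit p(t) = b0 + b1*sin(2*pi*t/365) + b2*cos(2*pi*t/365) by least squares.

    day_of_year   : array of integers 1..365 (one entry per observed day)
    storm_occurred: array of 0/1 Bernoulli outcomes for one storm category
    """
    t = np.asarray(day_of_year, dtype=float)
    y = np.asarray(storm_occurred, dtype=float)
    X = np.column_stack([np.ones_like(t),
                         np.sin(2 * np.pi * t / 365.0),
                         np.cos(2 * np.pi * t / 365.0)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

def daily_probability(beta, t):
    """Evaluate the fitted daily probability, clipped to [0, 1]."""
    p = (beta[0]
         + beta[1] * np.sin(2 * np.pi * t / 365.0)
         + beta[2] * np.cos(2 * np.pi * t / 365.0))
    return np.clip(p, 0.0, 1.0)

# Example with synthetic data: a unimodal wet season peaking near the year end.
rng = np.random.default_rng(0)
t_obs = np.tile(np.arange(1, 366), 10)
p_true = 0.3 + 0.25 * np.cos(2 * np.pi * (t_obs - 350) / 365.0)
y_obs = rng.binomial(1, p_true)
beta = fit_first_harmonic(t_obs, y_obs)
print("fitted coefficients:", np.round(beta, 3))
```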
2373 Comparative Study of Bending Angle in Laser Forming Process Using Artificial Neural Network and Fuzzy Logic System
Authors: M. Hassani, Y. Hassani, N. Ajudanioskooei, N. N. Benvid
Abstract:
Laser forming, as a non-contact thermal forming process, is widely used for forming and bending metallic and non-metallic sheets. In this process, the sheet is bent by laser irradiation along a specific path. One of the most important output parameters in laser forming is the bending angle, which depends on process parameters such as the physical and mechanical properties of the material, laser power, laser travel speed and the number of scan passes. In this paper, an artificial neural network and a fuzzy logic system were used to predict the bending angle in the laser forming process. Inputs to these models were laser travel speed and laser power. The comparison of the artificial neural network and fuzzy logic models with experimental results shows that both models predict bending angles with minimal error.
Keywords: Artificial neural network, bending angle, fuzzy logic, laser forming.
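Because the models have only two inputs (laser power and travel speed) and one output (bending angle), the neural network half of the comparison can be sketched very compactly. The snippet below uses scikit-learn's MLPRegressor on made-up data; the layer size, training values and units are assumptions, not the authors' experimental setup.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Hypothetical training data: [laser power (W), travel speed (mm/s)] -> bending angle (deg)
X = np.array([[200, 10], [200, 20], [300, 10], [300, 20],
              [400, 10], [400, 20], [500, 10], [500, 20]], dtype=float)
y = np.array([0.8, 0.5, 1.4, 0.9, 2.1, 1.3, 2.9, 1.8])

# Scale inputs, then fit a small multilayer perceptron.
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(8,), solver="lbfgs",
                                   max_iter=5000, random_state=0))
model.fit(X, y)

# Predict the bending angle for an unseen power/speed combination.
print(model.predict([[350, 15]]))
```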
2372 Application Methodology for the Generation of 3D Thermal Models Using UAV Photogrammetry and Dual Sensors for Mining/Industrial Facilities Inspection
Authors: Javier Sedano-Cibrián, Julio Manuel de Luis-Ruiz, Rubén Pérez-Álvarez, Raúl Pereda-García, Beatriz Malagón-Picón
Abstract:
Structural inspection activities are necessary to ensure the correct functioning of infrastructures. UAV techniques have become more popular than traditional techniques for this purpose. Specifically, UAV photogrammetry allows time and cost savings. The development of this technology has permitted the use of low-cost thermal sensors in UAVs. The representation of 3D thermal models with this type of equipment is in continuous evolution. The direct processing of thermal images usually leads to errors and inaccurate results. In this paper, a methodology is proposed for the generation of 3D thermal models using dual sensors, which involves the application of RGB and thermal images in parallel. Hence, the RGB images are used as the basis for the generation of the model geometry, and the thermal images are the source of the surface temperature information that is projected onto the model. The resulting representations of mining/industrial facilities can be used for inspection activities.
Keywords: Aerial thermography, data processing, drone, low-cost, point cloud.
2371 3D Point Cloud Model Color Adjustment by Combining Terrestrial Laser Scanner and Close Range Photogrammetry Datasets
Authors: M. Pepe, S. Ackermann, L. Fregonese, C. Achille
Abstract:
3D models obtained with advanced survey techniques such as close-range photogrammetry and laser scanning are nowadays particularly appreciated in the Cultural Heritage and Archaeology fields. In order to produce high-quality models representing archaeological evidence and anthropological artifacts, the appearance of the model (i.e., its color), beyond the geometric accuracy, is not a negligible aspect. The integration of close-range photogrammetry survey techniques with the laser scanner is still a topic of study and research. Combining point cloud data sets of the same object generated with both technologies, or with the same technology but registered at different moments and/or under different natural light conditions, can produce a final point cloud with accentuated color dissimilarities. In this paper, a methodology to make the different data sets uniform, improve the chromatic quality and highlight further details by balancing the point colors is presented.
Keywords: Color models, cultural heritage, laser scanner, photogrammetry, point cloud color.
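As a generic illustration of balancing point colors between two registered clouds, the sketch below applies a simple per-channel mean/standard-deviation transfer; this is a common baseline idea, not the methodology proposed in the paper, and the color values are synthetic.

```python
import numpy as np

def match_color_statistics(colors_src, colors_ref):
    """Shift and scale each RGB channel of colors_src so its mean and standard
    deviation match those of colors_ref.

    Both arrays are (N, 3) float arrays with values in [0, 1]; N may differ.
    """
    src = np.asarray(colors_src, dtype=float)
    ref = np.asarray(colors_ref, dtype=float)
    src_mean, src_std = src.mean(axis=0), src.std(axis=0) + 1e-12
    ref_mean, ref_std = ref.mean(axis=0), ref.std(axis=0)
    adjusted = (src - src_mean) / src_std * ref_std + ref_mean
    return np.clip(adjusted, 0.0, 1.0)

# Example: darker photogrammetric colors adjusted toward the scanner cloud's statistics.
rng = np.random.default_rng(1)
photo_colors = rng.uniform(0.1, 0.5, size=(1000, 3))   # hypothetical photogrammetry colors
scan_colors = rng.uniform(0.3, 0.8, size=(800, 3))     # hypothetical laser scanner colors
balanced = match_color_statistics(photo_colors, scan_colors)
print(balanced.mean(axis=0), scan_colors.mean(axis=0))
```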
2370 Efficient Solution for a Class of Markov Chain Models of Tandem Queueing Networks
Authors: Chun Wen, Tingzhu Huang
Abstract:
We present a new numerical method for the computation of the steady-state solution of Markov chains. Theoretical analyses show that the proposed method, with a contraction factor α, converges to the one-dimensional null space of singular linear systems of the form Ax = 0. Numerical experiments are used to illustrate the effectiveness of the proposed method, with applications to a class of interesting models in the domain of tandem queueing networks.
Keywords: Markov chains, tandem queueing networks, convergence, effectiveness.
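For context, the steady-state vector pi of an irreducible chain with row-stochastic transition matrix P satisfies pi P = pi, i.e. it spans the null space of A = P^T - I with the normalization sum(pi) = 1. The sketch below computes it by plain power iteration on a small birth-death chain (a generic method shown only for orientation, not the contraction-based scheme proposed in the paper).

```python
import numpy as np

def steady_state(P, tol=1e-12, max_iter=100_000):
    """Power iteration for pi with pi = pi @ P and sum(pi) = 1 (row-stochastic P)."""
    n = P.shape[0]
    pi = np.full(n, 1.0 / n)
    for _ in range(max_iter):
        nxt = pi @ P
        nxt /= nxt.sum()
        if np.linalg.norm(nxt - pi, 1) < tol:
            return nxt
        pi = nxt
    return pi

# Small illustrative birth-death chain (e.g. a queue with length 0..3).
P = np.array([[0.5, 0.5, 0.0, 0.0],
              [0.3, 0.2, 0.5, 0.0],
              [0.0, 0.3, 0.2, 0.5],
              [0.0, 0.0, 0.3, 0.7]])
pi = steady_state(P)
print(np.round(pi, 4), "residual:", np.linalg.norm(pi @ P - pi))
```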
2369 Developing Pedotransfer Functions for Estimating Some Soil Properties Using Artificial Neural Network and Multivariate Regression Approaches
Authors: Fereydoon Sarmadian, Ali Keshavarzi
Abstract:
The study of soil properties such as field capacity (F.C.) and permanent wilting point (P.W.P.) plays an important role in the study of the soil moisture retention curve. Although these parameters can be measured directly, their measurement is difficult and expensive. Pedotransfer functions (PTFs) provide an alternative by estimating soil parameters from more readily available soil data. In this investigation, 70 soil samples were collected from different horizons of 15 soil profiles located in the Ziaran region, Qazvin province, Iran. The data set was divided into two subsets for calibration (80%) and testing (20%) of the models, and their normality was tested by the Kolmogorov-Smirnov method. Both multivariate regression and artificial neural network (ANN) techniques were employed to develop appropriate PTFs for predicting soil parameters using the easily measurable characteristics of clay, silt, O.C, S.P, B.D and CaCO3. The performance of the multivariate regression and ANN models was evaluated using an independent test data set. In order to evaluate the models, the root mean square error (RMSE) and R2 were used. The comparison of RMSE for the two models showed that the ANN model gives better estimates of F.C and P.W.P than the multivariate regression model. The RMSE and R2 values derived by the ANN model for F.C and P.W.P were (2.35, 0.77) and (2.83, 0.72), respectively. The corresponding values for the multivariate regression model were (4.46, 0.68) and (5.21, 0.64), respectively. The results showed that an ANN with five neurons in the hidden layer had better performance in predicting soil properties than multivariate regression.
Keywords: Artificial neural network, Field capacity, Permanent wilting point, Pedotransfer functions.
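A minimal sketch of the comparison described above (multivariate regression versus an ANN with five hidden neurons, scored by RMSE and R2) using scikit-learn; the synthetic data, feature ranges and 80/20 split are placeholders, not the Ziaran soil data.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error, r2_score

rng = np.random.default_rng(42)
# Hypothetical predictors: clay %, silt %, O.C, S.P, B.D, CaCO3
X = rng.uniform([10, 20, 0.2, 30, 1.2, 1], [50, 60, 2.0, 70, 1.7, 20], size=(70, 6))
# Hypothetical target (field capacity) loosely tied to clay and organic carbon
y = 10 + 0.4 * X[:, 0] + 3.0 * X[:, 2] + rng.normal(0, 2, 70)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

models = {
    "multivariate regression": LinearRegression(),
    "ANN (5 hidden neurons)": make_pipeline(
        StandardScaler(),
        MLPRegressor(hidden_layer_sizes=(5,), solver="lbfgs", max_iter=5000, random_state=0)),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    pred = model.predict(X_te)
    rmse = mean_squared_error(y_te, pred) ** 0.5
    print(f"{name}: RMSE={rmse:.2f}, R2={r2_score(y_te, pred):.2f}")
```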
2368 Sentiment Analysis of Fake Health News Using Naive Bayes Classification Models
Authors: Danielle Shackley, Yetunde Folajimi
Abstract:
As more people turn to the internet seeking health-related information, there is more risk of finding false, inaccurate, or dangerous information. Sentiment analysis is a natural language processing technique that assigns a polarity score to text, ranging from positive through neutral to negative. In this research, we evaluate the weight of a sentiment analysis feature added to fake health news classification models. The dataset consists of existing, reliably labeled health article headlines that were supplemented with health information collected about COVID-19 from social media sources. We started with data preprocessing and tested various vectorization methods such as Count and TF-IDF vectorization. We implemented three Naive Bayes classifier models: Bernoulli, Multinomial and Complement. To test the weight of the sentiment analysis feature on the dataset, we created benchmark Naive Bayes classification models without sentiment analysis, and then the same models were reproduced with the feature added. We evaluated the models using precision and accuracy scores. The initial Bernoulli model performed with 90% precision and 75.2% accuracy, while the model supplemented with sentiment labels performed with 90.4% precision and accuracy that stayed constant at 75.2%. Our results show that the addition of sentiment analysis did not improve model precision by a wide margin; while there was no evidence of improvement in accuracy, we obtained a 1.9% improvement in the precision score with the Complement model. Future expansion of this work could include replicating the experiment process and substituting the Naive Bayes classifiers with a deep learning neural network model.
Keywords: Sentiment analysis, Naive Bayes model, natural language processing, topic analysis, fake health news classification model.
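A minimal sketch of the kind of pipeline described above: TF-IDF text features plus a sentiment label column fed to a Complement Naive Bayes classifier. The tiny headline list, labels and hand-made lexicon are illustrative assumptions standing in for the real dataset and sentiment analyzer.

```python
import numpy as np
from scipy.sparse import hstack, csr_matrix
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import ComplementNB

# Tiny illustrative dataset: headline text and a real/fake label (1 = fake).
headlines = [
    "miracle herb cures covid in two days",
    "vaccine trial shows strong protection in study",
    "doctors hide this one secret remedy",
    "health agency updates mask guidance after review",
]
labels = np.array([1, 0, 1, 0])

# Very small hand-made lexicon standing in for a sentiment analyzer.
POSITIVE = {"strong", "protection", "cures", "miracle"}
NEGATIVE = {"hide", "secret"}

def sentiment_label(text):
    """Map text to 0 (negative), 1 (neutral) or 2 (positive) by word counts."""
    words = text.split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return 0 if score < 0 else (2 if score > 0 else 1)

vectorizer = TfidfVectorizer()
X_text = vectorizer.fit_transform(headlines)
X_sent = csr_matrix(np.array([[sentiment_label(h)] for h in headlines], dtype=float))

# Benchmark model (text only) vs. the same model with the sentiment feature appended.
for name, X in [("text only", X_text), ("text + sentiment", hstack([X_text, X_sent]))]:
    clf = ComplementNB().fit(X, labels)
    print(name, "training accuracy:", clf.score(X, labels))
```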
2367 The COVID-19 Pandemic: Lessons Learned in Promoting Student Internationalisation
Authors: David Cobham
Abstract:
In higher education, a great degree of importance is placed on the internationalisation of the student experience. This is seen as a valuable contributor to elements such as building confidence, broadening knowledge, creating networks and connections, and enhancing employability for current students who will become the next generation of managers in technology and business. The COVID-19 pandemic has affected all areas of people's lives. The limitations on travel, coupled with the fears and concerns generated by the health risks, have dramatically reduced the opportunity for students to engage with this agenda. Institutions of higher education have been required to rethink fundamental aspects of their business model, from recruitment and enrolment through learning approaches and assessment methods to the pathway to employment. This paper presents a case study which focuses on student mobility and how the physical experience of being in another country, whether to study, to work, to volunteer or to gain cultural and social enhancement, has of necessity been replaced by alternative approaches. It considers trans-national education as an alternative to physical study overseas, virtual mobility and internships as an alternative to international work experience, and collaborative online projects as an alternative to in-person encounters. The paper concludes that although these elements were adopted to address the current situation, the lessons learnt and the feedback gained suggest that they have contributed successfully in new and sometimes unexpected ways, and that they will persist beyond the present to become part of the "new normal" for the future. That being the case, senior leaders of institutions of higher education will be required to revisit their international plans and to rewrite their international strategies to take account of and build upon these changes.
Keywords: Trans-national education, internationalisation, higher education management, virtual mobility.
2366 Business Model Topology in Emerging Business Ecosystem
Authors: Olga Novikova, Timo Vuori
Abstract:
This paper describes the topology of business models in the market ecosystem of the emerging electric mobility industry. The business model topology shows that a firm's participation in the ecosystem is associated with different requirements on resources and capabilities, and with different levels of risk. The business model concept is used together with the concept of networked value creation, showing that firms can achieve higher levels of sustainable advantage through cooperation rather than competition. Hybrid business models provide companies with a viable alternative for participation in the market ecosystem.
Keywords: Business model, ecosystem, topology.
2365 Multiple Sensors and JPDA-IMM-UKF Algorithm for Tracking Multiple Maneuvering Targets
Authors: Wissem Saidani, Yacine Morsly, Mohand Saïd Djouadi
Abstract:
In this paper, we consider the problem of tracking multiple maneuvering targets using switching multiple target motion models. With this paper, we aim to contribute to solving the problem of model-based body motion estimation by using data coming from visual sensors. The Interacting Multiple Model (IMM) algorithm is specially designed to accurately track targets whose state and/or measurement models (assumed to be linear) change during motion transitions. However, when these models are nonlinear, the IMM algorithm must be modified in order to guarantee an accurate track. In this paper, we propose to avoid the Extended Kalman filter because of its limitations and to substitute it with the Unscented Kalman filter, which seems to be more efficient, especially according to the simulation results obtained with the nonlinear IMM algorithm (IMM-UKF). To resolve the data association problem, the JPDA approach is combined with the IMM-UKF algorithm; the derived algorithm is denoted JPDA-IMM-UKF.
Keywords: Estimation, Kalman filtering, Multi-Target Tracking, Visual servoing, data association.
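For readers unfamiliar with the UKF building block mentioned above, the sketch below shows the unscented transform: deterministic sigma points are propagated through a nonlinear function and their weighted statistics recovered. It uses the common Merwe-scaled formulation with assumed tuning parameters and is only this one building block, not the authors' JPDA-IMM-UKF tracker.

```python
import numpy as np

def sigma_points(mean, cov, alpha=1.0, beta=2.0, kappa=0.0):
    """Merwe-scaled sigma points and weights for an n-dimensional Gaussian."""
    n = len(mean)
    lam = alpha**2 * (n + kappa) - n
    S = np.linalg.cholesky((n + lam) * cov)          # matrix square root of (n+lam)*cov
    pts = [mean] + [mean + S[:, i] for i in range(n)] + [mean - S[:, i] for i in range(n)]
    wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))
    wc = wm.copy()
    wm[0] = lam / (n + lam)
    wc[0] = lam / (n + lam) + (1 - alpha**2 + beta)
    return np.array(pts), wm, wc

def unscented_transform(f, mean, cov):
    """Propagate (mean, cov) through a nonlinear function f."""
    pts, wm, wc = sigma_points(mean, cov)
    fpts = np.array([f(p) for p in pts])
    f_mean = wm @ fpts
    diff = fpts - f_mean
    f_cov = (wc[:, None] * diff).T @ diff
    return f_mean, f_cov

# Example: a polar range-bearing measurement of a 2D position.
h = lambda x: np.array([np.hypot(x[0], x[1]), np.arctan2(x[1], x[0])])
m, P = np.array([10.0, 5.0]), np.diag([0.5, 0.5])
z_mean, z_cov = unscented_transform(h, m, P)
print(z_mean, "\n", z_cov)
```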
2364 A Graphical Environment for Petri Nets INA Tool Based on Meta-Modelling and Graph Grammars
Authors: Raida El Mansouri, Elhillali Kerkouche, Allaoua Chaoui
Abstract:
The Petri net tool INA is a well-known tool in the Petri net community. However, it lacks a graphical environment to create and analyse INA models. Building a modelling tool for design and analysis from scratch (for the INA tool, for example) is generally a prohibitive task. The meta-modelling approach is useful for dealing with such problems since it allows the modelling of the formalisms themselves. In this paper, we propose an approach based on the combined use of meta-modelling and graph grammars to automatically generate a visual modelling tool for INA for analysis purposes. In our approach, the UML class diagram formalism is used to define a meta-model of INA models. The meta-modelling tool AToM3 is used to generate a visual modelling tool according to the proposed INA meta-model. We have also proposed a graph grammar to automatically generate the INA description of the graphically specified Petri net models. This allows the user to avoid the errors that arise when this description is written manually. The INA tool is then used to perform the simulation and analysis of the resulting INA description. Our environment is illustrated through an example.
Keywords: INA, Meta-modelling, Graph Grammars, AToM3, Automatic Code Generation.
2363 Modeling and Verification for the Micropayment Protocol Netpay
Authors: Kaylash Chaudhary, Ansgar Fehnker
Abstract:
There are many virtual payment systems available for conducting micropayments, and it is essential that their protocols satisfy the highest standards of correctness. This paper examines the Netpay protocol [3], provides its formalization as an automata model, and proves two important correctness properties, namely the absence of deadlock and the validity of an ecoin during the execution of the protocol. The paper assumes a cooperative customer and proves that the protocol executes according to its description.
Keywords: Model, verification, micropayment.
2362 Analysis of the Interference from Risk-Determining Factors of Cooperative and Conventional Construction Contracts
Authors: E. Harrer, M. Mauerhofer, T. Werginz
Abstract:
As a result of intensive competition, the building sector suffers from a high degree of rivalry. Furthermore, an unbalanced distribution of project risks can be observed: clients aim to shift their own risks into the sphere of the constructors or planners. The consequence is that the number of conflicts between the involved parties is inordinately high and even increasing; an alternative approach to counter these developments is cooperative project forms in the construction sector. This research compares conventional contract models and models with partnering agreements to examine the influence of an early integration of the involved parties on project risks. The goal is to show deviations in different project stages, from the design phase to the project transfer phase. These deviations are evaluated by a survey of experts from the three spheres: clients, contractors and planners. By rating the influence of the participants on specific risk factors, it is possible to identify factors which are relevant for a smooth project execution.
Keywords: Collaborative work, construction industry, contract models, influence, partnering, project management, risk.
2361 A Study on Removal of Toluidine Blue Dye from Aqueous Solution by Adsorption onto Neem Leaf Powder
Authors: Himanshu Patel, R. T. Vashi
Abstract:
Adsorption of Toluidine Blue dye from aqueous solutions onto Neem Leaf Powder (NLP) has been investigated. The surface characterization of this natural material was examined by particle size analysis, Scanning Electron Microscopy (SEM), Fourier Transform Infrared (FTIR) spectroscopy and X-Ray Diffraction (XRD). The effects of process parameters such as initial concentration, pH, temperature and contact duration on the adsorption capacity have been evaluated, among which pH was found to be the most influential parameter. The equilibrium data were analyzed using the Langmuir and Freundlich isotherms, and kinetic models such as the pseudo-first-order and pseudo-second-order models and the Elovich equation were utilized to describe the kinetic data. The experimental data were well fitted by the Langmuir adsorption isotherm model and the pseudo-second-order kinetic model. The thermodynamic parameters, namely the free energy of adsorption (ΔG°), enthalpy change (ΔH°) and entropy change (ΔS°), were also determined and evaluated.
Keywords: Adsorption, isotherm models, kinetic models, temperature, toluidine blue dye, surface chemistry.
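A minimal curve-fitting sketch of the two best-fitting models named above, the Langmuir isotherm qe = qm*KL*Ce/(1 + KL*Ce) and the pseudo-second-order kinetics qt = k2*qe^2*t/(1 + k2*qe*t), using scipy; the data points and initial guesses are placeholders, not the paper's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(Ce, qm, KL):
    """Langmuir isotherm: equilibrium uptake qe (mg/g) vs. equilibrium conc. Ce (mg/L)."""
    return qm * KL * Ce / (1.0 + KL * Ce)

def pseudo_second_order(t, qe, k2):
    """Pseudo-second-order kinetics: uptake qt (mg/g) vs. contact time t (min)."""
    return k2 * qe**2 * t / (1.0 + k2 * qe * t)

# Hypothetical equilibrium and kinetic data for a dye-on-NLP system.
Ce = np.array([5, 10, 20, 40, 80, 120], dtype=float)
qe = np.array([8.1, 13.9, 21.4, 28.7, 33.9, 35.8])
t = np.array([5, 10, 20, 40, 60, 90, 120], dtype=float)
qt = np.array([6.2, 10.5, 16.0, 21.8, 24.0, 25.4, 25.9])

(qm, KL), _ = curve_fit(langmuir, Ce, qe, p0=[40, 0.05])
(qe_fit, k2), _ = curve_fit(pseudo_second_order, t, qt, p0=[26, 0.01])
print(f"Langmuir: qm={qm:.1f} mg/g, KL={KL:.3f} L/mg")
print(f"Pseudo-second-order: qe={qe_fit:.1f} mg/g, k2={k2:.4f} g/(mg*min)")
```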
2360 Validation of Reverse Engineered Web Application Models
Authors: Carlo Bellettini, Alessandro Marchetto, Andrea Trentini
Abstract:
Web applications have become complex and crucial for many firms, especially when combined with areas such as CRM (Customer Relationship Management) and BPR (Business Process Reengineering). The scientific community has focused its attention on Web application design, development, analysis and testing, by studying and proposing methodologies and tools. Static and dynamic techniques may be used to analyze existing Web applications. The use of traditional static source code analysis may be very difficult, due to the presence of dynamically generated code and the multi-language nature of the Web. Dynamic analysis may be useful, but it has an intrinsic limitation: the low number of program executions used to extract information. Our reverse engineering analysis, used in our WAAT (Web Applications Analysis and Testing) project, applies mutational techniques in order to exploit server-side execution engines to accomplish part of the dynamic analysis. This paper studies the effects of mutation source code analysis applied to Web software to build application models. Mutation-based generated models may contain more information than necessary, so we need a pruning mechanism.
Keywords: Validation, Dynamic Analysis, Mutation Analysis, Reverse Engineering, Web Applications.
2359 Building a Service-Centric Business Model in SMEs in the Business-to-Business Context
Authors: Päivi J. Tossavainen , Leena Alakoski, Katri Ojasalo
Abstract:
Building a service-centric business model requires new knowledge and capabilities in companies. This paper sheds light on the challenges small and medium-sized firms (SMEs) face when developing their service-centric business models. It examines the premise for the knowledge transfer and capability development required. The objective of this paper is to increase knowledge about SMEs' transformation to service-centric business models. The paper reports an action research based case study and provides empirical evidence from three case companies. The empirical data were collected through multiple methods. The findings of the paper are: first, a model developed to analyze the current state in companies; second, the process of building service-centric business models; and third, the selection of suitable service development methods. The lack of a holistic understanding of service logic suggests that SMEs need practical and easy-to-use methods to improve their business.
Keywords: Service-centric business model, service development, action research, case study.
2358 Strategic Management Accounting: Implementation and Control
Authors: Alireza Azimi Sani
Abstract:
This paper discusses the design characteristics management accounting systems should have to be useful for strategic planning and control, and provides brief introductions to strategic variance analysis, profit-linked performance measurement models and the balanced scorecard. Two multi-period, multi-product models are specified; they can be related to Porter's strategy framework and to cost and revenue drivers, and can be used to support strategic planning, control and cost management.
Keywords: Accounting, balanced scorecard, profit-linked, strategic management, variance analysis.
2357 Second Language Development with an Intercultural Approach: A Pilot Program Applied to Higher Education Students from an Escuela Normal in Atequiza, Mexico
Authors: Frida C. Jaime Franco, C. Paulina Navarro Núñez, R. Jacob Sánchez Nájera
Abstract:
The importance of developing multi-language abilities in our global society is noteworthy. However, the necessity, interest and awareness of the significance of developing another language, apart from the mother tongue, are not the same in all contexts, particularly in rural higher education institutions immersed in small communities as compared with multicultural communities. Providing opportunities for digital interaction between learners from Mexico and partners abroad represents scaffolding towards not only language skills development but also intercultural communicative competence (ICC). This study leads us to consider the best approach for applying a program of ICC integrated into the practice of EFL. By analyzing the roots of the language, it is possible to keep sight of the main objective of learning another language, namely to communicate with a functional purpose, as well as to attach social practices to the learning process, giving functionality and significance to the target language. The collateral impact of collaborative learning thus aims to contribute to a better global understanding as well as to self and other cultural awareness through intercultural communication. When students communicate in the target language through online collaboration on long-distance communication platforms, language is used as a tool of interaction to broaden students' perspectives, reaching a substantial improvement with the help of their differences. This process should consider the application of the target language in the inquiry of sociocultural information, expecting the learners to integrate communicative skills to handle cultural differentiation at the same time as they apply the knowledge of their target language in a real scenario of communication, despite being through virtual resources.
Keywords: Collaborative learning, English as a Foreign language, intercultural communication, intercultural communicative competences, virtual partnership.
2356 Neural Network Models for Actual Cost and Actual Duration Estimation in Construction Projects: Findings from Greece
Authors: Panagiotis Karadimos, Leonidas Anthopoulos
Abstract:
Predicting the actual cost and duration of construction projects is a continuing problem for the construction sector. This paper addresses this problem with modern methods and data available from past public construction projects. Thirty-nine bridge projects constructed in Greece, with a similar type of available data, were examined. Considering each project's attributes together with the actual cost and the actual duration, correlation analysis is performed and the most appropriate predictive project variables are defined. Additionally, the most efficient subgroup of variables is selected with the use of the WEKA application, through its attribute selection function. The selected variables are used as input neurons for the neural network models. For constructing the neural network models, the FANN Tool application is used. The optimum neural network model for predicting the actual cost produced a mean squared error of 3.84886e-05 and was based on the budgeted cost and the quantity of deck concrete. The optimum neural network model for predicting the actual duration produced a mean squared error of 5.89463e-05 and was also based on the budgeted cost and the quantity of deck concrete.
Keywords: Actual cost and duration, attribute selection, bridge projects, neural networks, predicting models, FANN TOOL, WEKA.
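As an illustration of the attribute-selection step described above (ranking candidate project variables by their correlation with the actual cost), a short sketch with placeholder variable names and synthetic values; it mirrors the idea of correlation-based selection rather than reproducing WEKA's implementation.

```python
import numpy as np

# Hypothetical bridge-project attributes (one row per project).
rng = np.random.default_rng(7)
n = 39
budgeted_cost = rng.uniform(1.0, 8.0, n)                       # million EUR
deck_concrete = budgeted_cost * 120 + rng.normal(0, 40, n)     # m3, tied to cost
deck_width = rng.uniform(8, 14, n)                             # m, weakly related
actual_cost = 1.1 * budgeted_cost + 0.002 * deck_concrete + rng.normal(0, 0.3, n)

candidates = {"budgeted_cost": budgeted_cost,
              "deck_concrete": deck_concrete,
              "deck_width": deck_width}

# Rank attributes by absolute Pearson correlation with the target.
ranking = sorted(((abs(np.corrcoef(v, actual_cost)[0, 1]), k)
                  for k, v in candidates.items()), reverse=True)
for r, name in ranking:
    print(f"{name}: |r| = {r:.2f}")
```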
2355 Effect of Soil Corrosion in Failures of Buried Gas Pipelines
Authors: Saima Ali, Pathamanathan Rajeev, Imteaz A. Monzur
Abstract:
In this paper, a brief review of the corrosion mechanism in buried pipes and the modes of failure is provided, together with the available corrosion models. Moreover, a sensitivity analysis is performed to understand the influence of the corrosion model parameters on the remaining life estimation. Further, a probabilistic analysis is performed to propagate the uncertainty in the corrosion model into the estimation of the remaining life of the pipe. Finally, a comparison among the corrosion models on the basis of the remaining life estimation is provided to improve the renewal plan.
Keywords: Corrosion, pit depth, sensitivity analysis, exposure period.
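As an illustration of propagating corrosion-model uncertainty to the remaining-life estimate, the sketch below uses the widely cited power-law pit-depth form d(t) = k*t^n with Monte Carlo sampling; the model choice, parameter distributions and failure criterion are assumptions for illustration, not the paper's calibrated values.

```python
import numpy as np

def remaining_life_samples(wall_thickness_mm, n_samples=100_000, seed=0):
    """Monte Carlo samples of the time for pit depth d(t) = k * t**n to reach
    a critical fraction of the wall thickness."""
    rng = np.random.default_rng(seed)
    k = rng.lognormal(mean=np.log(0.3), sigma=0.3, size=n_samples)  # mm / yr**n (assumed)
    n = rng.normal(0.55, 0.05, size=n_samples)                      # growth exponent (assumed)
    d_crit = 0.8 * wall_thickness_mm                                # assumed failure criterion
    return (d_crit / k) ** (1.0 / n)                                # years until d(t) = d_crit

life = remaining_life_samples(wall_thickness_mm=9.5)
print("median life      [yr]:", np.round(np.median(life), 1))
print("5th percentile   [yr]:", np.round(np.percentile(life, 5), 1))
```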
2354 The Digital Microscopy in Organ Transplantation: Ergonomics of the Tele-Pathological Evaluation of Renal, Liver and Pancreatic Grafts
Authors: C. S. Mammas, A. Lazaris, A. S. Mamma-Graham, G. Kostopanagiotou, C. Lemonidou, J. Mantas, E. Patsouris
Abstract:
Introduction: The process of building a better safety culture, with methods of error analysis and preventive measures, starts with an understanding of the effects of human factors engineering on remote microscopic diagnosis in surgery, and especially in organ transplantation for the remote evaluation of the grafts. It has been estimated that even in well-organized transplant systems an average of 8% to 14% of the grafts (G) that arrive at the recipient hospitals may be considered as diseased, injured, damaged or improper for transplantation. Digital microscopy adds information on a microscopic level about the grafts in Organ Transplant (OT), and may lead to a change in their management. Such a method will reduce the possibility that a diseased G will arrive at the recipient hospital for implantation. Aim: Ergonomics of Digital Microscopy (DM) based on virtual slides, on Telemedicine Systems (TS), for Tele-Pathological Evaluation (TPE) of the grafts (G) in organ transplantation (OT). Material and Methods: By experimental simulation, the ergonomics of DM for microscopic TPE of Renal Graft (RG), Liver Graft (LG) and Pancreatic Graft (PG) tissues is analyzed. In fact, this corresponded to the ergonomics of digital microscopy for TPE in OT by applying a Virtual Slide (VS) system for graft tissue image capture, for remote diagnosis of possible microscopic inflammatory and/or neoplastic lesions. Experimentation included: a. development of an OTE-TS-similar Experimental Telemedicine System (Exp.-TS); b. simulation of the integration of TS with the VS-based microscopic TPE of RG, LG and PG applying DM. Simulation of the DM-based TPE was performed by two specialists on a total of 238 human Renal Graft (RG), 172 Liver Graft (LG) and 108 Pancreatic Graft (PG) digital microscopic tissue images, assessed for inflammatory and neoplastic lesions on the electronic spaces of the four TS used. Results: Statistical analysis of the specialists' answers about the ability to diagnose accurately the diseased RG, LG and PG tissues on the electronic space among the four TS (A, B, C, D) showed that DM on TS for TPE in OT is elaborated perfectly on the electronic space of a desktop, followed by the electronic space of the applied Exp.-TS. Tablet and mobile phone electronic spaces seem significantly risky for the application of DM in OT (p<.001). Conclusion: To make the largest reduction in errors and adverse events referring to the quality of the grafts, it will take application of human factors engineering to procurement, design, audit, and awareness-raising activities. Consequently, it will take an investment in new training, people, and other changes to management activities for DM in OT. The simulated VS-based TPE with DM of RG, LG and PG tissues after retrieval seems feasible and reliable, and dependent on the size of the electronic space of the applied TS, for remote prevention of diseased grafts from being retrieved and/or sent to the recipient hospital and for post-grafting and pre-transplant planning.
Keywords: Organ Transplantation, Tele-Pathology, Digital Microscopy, Virtual Slides.
2353 A Numerical Study on the Influence of CO2 Dilution on Combustion Characteristics of a Turbulent Diffusion Flame
Authors: Yasaman Tohidi, Rouzbeh Riazi, Shidvash Vakilipour, Masoud Mohammadi
Abstract:
The objective of the present study is to numerically investigate the effect of replacing N2 with CO2 in the air stream on the characteristics of a CH4 turbulent diffusion flame. The open-source Field Operation and Manipulation (OpenFOAM) package has been used as the computational tool. In this regard, the laminar flamelet and modified k-ε models have been utilized as the combustion and turbulence models, respectively. Results reveal that the presence of CO2 in the air stream changes the flame shape and the maximum flame temperature. Also, CO2 dilution causes an increment in the CO mass fraction.
Keywords: CH4 diffusion flame, CO2 dilution, OpenFOAM, turbulent flame.
2352 Masonry CSEB Building Models under Shake Table Testing – An Experimental Study
Authors: Lakshmi Keshav, V. G. Srisanthi
Abstract:
In this experimental investigation, shake table tests were conducted on two reduced-scale models representing a normal single-room building constructed with Compressed Stabilized Earth Blocks (CSEB) made from locally available soil. One model was constructed with earthquake-resisting features (EQRF), namely a sill band, a lintel band and vertical bands to control the building vibration, and the other was without earthquake-resisting features. To examine the seismic capacity of the models, particularly when subjected to long-period ground motion of large amplitude with many cycles of repeated loading, the test specimens were shaken repeatedly until failure. The test results from a high-end data acquisition system show that the model with EQRF behaves better than the one without EQRF. This modified masonry model, combining the new material with the new bands, is used to improve the behavior of masonry buildings.
Keywords: Earthquake Resisting Features, Compressed Stabilized Earth Blocks, Masonry structures, Shake table testing, Horizontal and vertical bands.
2351 An Improved Prediction Model of Ozone Concentration Time Series Based on Chaotic Approach
Authors: N. Z. A. Hamid, M. S. M. Noorani
Abstract:
This study is focused on the development of prediction models for the ozone concentration time series. The prediction model is built based on a chaotic approach. Firstly, the chaotic nature of the time series is detected by means of a phase space plot and the Cao method. Then, the prediction model is built and the local linear approximation method is used for forecasting purposes. A traditional autoregressive linear prediction model is also built. Moreover, an improvement to the local linear approximation method is performed. The prediction models are applied to the hourly ozone time series observed at the benchmark station in Malaysia. Comparison of all models through the calculation of the mean absolute error, root mean squared error and correlation coefficient shows that the one with the improved prediction method is the best. Thus, the chaotic approach is a good approach for developing a prediction model for the ozone concentration time series.
Keywords: Chaotic approach, phase space, Cao method, local linear approximation method.
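A compact sketch of the forecasting step described above: delay-embed the series, find the nearest neighbours of the last embedded point, and fit a local linear map to predict one step ahead. The embedding dimension, delay and neighbour count are assumed example values, not those selected by the Cao method in the paper, and the test series is synthetic.

```python
import numpy as np

def local_linear_forecast(x, dim=3, delay=1, k=10):
    """One-step-ahead forecast via local linear approximation in phase space."""
    x = np.asarray(x, dtype=float)
    # Build delay vectors X_t = [x_t, x_{t-delay}, ..., x_{t-(dim-1)*delay}]
    idx = np.arange((dim - 1) * delay, len(x) - 1)
    emb = np.column_stack([x[idx - j * delay] for j in range(dim)])
    target = x[idx + 1]                          # value one step after each delay vector
    query = np.array([x[-1 - j * delay] for j in range(dim)])  # current state

    # k nearest neighbours of the current state.
    dists = np.linalg.norm(emb - query, axis=1)
    nn = np.argsort(dists)[:k]

    # Local linear model target ~ a0 + a @ state, fitted on the neighbours.
    A = np.column_stack([np.ones(k), emb[nn]])
    coef, *_ = np.linalg.lstsq(A, target[nn], rcond=None)
    return coef[0] + coef[1:] @ query

# Example on a noisy periodic series standing in for hourly ozone data.
t = np.arange(2000)
series = np.sin(2 * np.pi * t / 24) + 0.1 * np.random.default_rng(3).normal(size=t.size)
print("forecast:", local_linear_forecast(series),
      "noise-free next value:", np.sin(2 * np.pi * 2000 / 24))
```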
2350 Modeling Default Probabilities of the Chosen Czech Banks in the Time of the Financial Crisis
Authors: Petr Gurný
Abstract:
One of the most important tasks in risk management is the correct determination of the probability of default (PD) of particular financial subjects. In this paper, the possibility of determining a financial institution's PD according to credit-scoring models is discussed. The paper is divided into two parts. The first part is devoted to the estimation of three different models (based on linear discriminant analysis, logit regression and probit regression) from a sample of almost three hundred US commercial banks. Afterwards, these models are compared and verified on a control sample with a view to choosing the best one. The second part of the paper is aimed at the application of the chosen model to the portfolio of three key Czech banks to estimate their present financial stability. However, it is no less important to be able to estimate the evolution of PD in the future. For this reason, the second task in this paper is to estimate the probability distribution of the future PD for the Czech banks. To this end, the values of the particular indicators are sampled randomly and the distribution of the PDs is estimated, while it is assumed that the indicators are distributed according to a multidimensional subordinated Lévy model (the Variance Gamma model and the Normal Inverse Gaussian model, in particular). Although the obtained results show that all banks are relatively healthy, there is still a high chance that "a financial crisis" will occur, at least in terms of probability. This is indicated by the estimation of various quantiles in the estimated distributions. Finally, it should be noted that the applicability of the estimated model (with respect to the data used) is limited to the recessionary phase of the financial market.
Keywords: Credit-scoring Models, Multidimensional Subordinated Lévy Model, Probability of Default.
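A minimal sketch of the logit branch of the credit-scoring comparison: a logistic regression mapping financial indicators to a probability of default. The indicator names, simulated values and decision setup are placeholders, not the US or Czech bank data used in the paper.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
n = 300
# Hypothetical indicators: capital adequacy (%), return on assets (%), NPL ratio (%).
X = np.column_stack([rng.normal(14, 3, n), rng.normal(1.0, 0.8, n), rng.normal(5, 2, n)])
# Hypothetical default flag: weaker capital/profitability and more bad loans raise the risk.
logit = -2.0 - 0.2 * (X[:, 0] - 14) - 1.0 * X[:, 1] + 0.4 * (X[:, 2] - 5)
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

model = make_pipeline(StandardScaler(), LogisticRegression()).fit(X, y)

# Estimated PD for a new bank with given indicator values.
new_bank = np.array([[12.0, 0.4, 7.5]])
print("estimated PD:", model.predict_proba(new_bank)[0, 1].round(3))
```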
2349 Data Collection with Bounded-Sized Messages in Wireless Sensor Networks
Authors: Min Kyung An
Abstract:
In this paper, we study the data collection problem in Wireless Sensor Networks (WSNs) under two interference models: the graph model and the more realistic physical interference model known as Signal-to-Interference-plus-Noise Ratio (SINR). The main issue of the problem is to compute schedules with the minimum number of timeslots, that is, to compute minimum latency schedules, such that data from every node can be collected at a sink node without any collision or interference. While existing works studied the problem with unit-sized and unbounded-sized message models, we investigate the problem with the bounded-sized message model and introduce a constant factor approximation algorithm. To the best of our knowledge, our result is the first result for the data collection problem with the bounded-sized message model in both interference models.
Keywords: Data collection, collision-free, interference-free, physical interference model, SINR, approximation, bounded-sized message model, wireless sensor networks, WSN.
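For readers unfamiliar with the SINR model mentioned above, the sketch below checks whether a set of simultaneous transmissions in one timeslot is feasible: each link's received power must exceed beta times the noise plus the interference from all other concurrent senders. The path-loss exponent, threshold, powers and node positions are assumed example values, not parameters from the paper.

```python
import numpy as np

def slot_is_feasible(senders, receivers, power=1.0, alpha=3.0, beta=2.0, noise=1e-6):
    """Return True if every (sender -> receiver) link meets the SINR threshold
    when all listed senders transmit in the same timeslot."""
    senders = np.asarray(senders, dtype=float)
    receivers = np.asarray(receivers, dtype=float)
    for i, rx in enumerate(receivers):
        d_signal = np.linalg.norm(senders[i] - rx)
        signal = power / d_signal**alpha
        interference = sum(power / np.linalg.norm(senders[j] - rx)**alpha
                           for j in range(len(senders)) if j != i)
        if signal / (noise + interference) < beta:
            return False
    return True

# Two concurrent links: well-separated pairs pass; crossing, tightly packed links fail.
print(slot_is_feasible(senders=[(0, 0), (50, 0)], receivers=[(1, 0), (51, 0)]))
print(slot_is_feasible(senders=[(0, 0), (3, 0)], receivers=[(2, 0), (1, 0)]))
```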
2348 Meta Model Based EA for Complex Optimization
Authors: Maumita Bhattacharya
Abstract:
Evolutionary algorithms are population-based, stochastic search techniques, widely used as efficient global optimizers. However, many real-life optimization problems often require finding optimal solutions to complex, high-dimensional, multimodal problems involving computationally very expensive fitness function evaluations. The use of evolutionary algorithms in such problem domains is thus practically prohibitive. An attractive alternative is to build meta-models, i.e. to use an approximation of the actual fitness function to be evaluated. These meta-models are orders of magnitude cheaper to evaluate than the actual function. Many regression and interpolation tools are available to build such meta-models. This paper briefly discusses the architectures and use of such meta-modeling tools in an evolutionary optimization context. We further present two evolutionary algorithm frameworks which involve the use of meta-models for fitness function evaluation. The first framework, namely the Dynamic Approximate Fitness based Hybrid EA (DAFHEA) model [14], reduces computation time by controlled use of meta-models (in this case an approximate model generated by Support Vector Machine regression) to partially replace the actual function evaluation with approximate function evaluation. However, the underlying assumption in DAFHEA is that the training samples for the meta-model are generated from a single uniform model. This does not take into account uncertain scenarios involving noisy fitness functions. The second model, DAFHEA-II, an enhanced version of the original DAFHEA framework, incorporates a multiple-model based learning approach for the support vector machine approximator to handle noisy functions [15]. Empirical results obtained by evaluating the frameworks using several benchmark functions demonstrate their efficiency.
Keywords: Meta model, Evolutionary algorithm, Stochastic technique, Fitness function, Optimization, Support vector machine.
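A compact sketch of the general idea behind surrogate-assisted evaluation: an SVR meta-model, trained on all true evaluations seen so far, pre-screens offspring so that only the most promising ones receive the expensive true evaluation. This is a generic illustration of the approach under assumed settings (sphere objective, Gaussian mutation, simple replacement), not the DAFHEA algorithm itself.

```python
import numpy as np
from sklearn.svm import SVR

def expensive_fitness(x):
    """Stand-in for a costly objective (sphere function, to be minimized)."""
    return float(np.sum(x**2))

rng = np.random.default_rng(0)
dim, pop_size, generations = 5, 20, 30
pop = rng.uniform(-5, 5, size=(pop_size, dim))
fit = np.array([expensive_fitness(x) for x in pop])
archive_X, archive_y = list(pop), list(fit)          # all truly evaluated points

for _ in range(generations):
    # Train the meta-model on every true evaluation seen so far.
    surrogate = SVR(kernel="rbf", C=100.0).fit(np.array(archive_X), np.array(archive_y))

    # Generate offspring by Gaussian mutation and pre-screen them on the surrogate.
    offspring = pop + rng.normal(0, 0.5, size=pop.shape)
    promising = np.argsort(surrogate.predict(offspring))[: pop_size // 2]

    # Spend expensive evaluations only on the surrogate's best half.
    for idx in promising:
        f = expensive_fitness(offspring[idx])
        archive_X.append(offspring[idx]); archive_y.append(f)
        worst = int(np.argmax(fit))
        if f < fit[worst]:                           # replace the current worst parent
            pop[worst], fit[worst] = offspring[idx], f

print("best fitness found:", fit.min())
```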
2347 Application of Transportation Models for Analysing Future Intercity and Intracity Travel Patterns in Kuwait
Authors: Srikanth Pandurangi, Basheer Mohammed, Nezar Al Sayegh
Abstract:
In order to meet the increasing demand for housing for Kuwaiti citizens, the government authorities in Kuwait are undertaking a series of projects in the form of new large cities outside the current urban area. Al Mutlaa City, located to the north-west of the Kuwait Metropolitan Area, is one such project out of the 15 planned new cities. The city accommodates a wide variety of residential developments, employment opportunities, and commercial, recreational, health care and institutional uses. This paper examines the application of comprehensive transportation demand modeling work undertaken on the VISUM platform to understand future intracity and intercity travel distribution patterns in Kuwait. The scope of the models developed varied in level of detail: a strategic model update, sub-area models representing the future demand of Al Mutlaa City, and sub-area models built to estimate the demand in the residential neighborhoods of the city. This paper aims at offering a model update framework that facilitates easy integration between sub-area models and strategic national models for unified traffic forecasts. The paper presents the transportation demand modeling results utilized in informing the planning of the multi-modal transportation system for Al Mutlaa City. It also presents the household survey data collection efforts undertaken using GPS devices (for the first time in Kuwait) and notebook-computer-based digital survey forms for interviewing a representative sample of citizens and residents. The survey results formed the basis for estimating the trip generation rates and trip distribution coefficients used in the strategic base-year model calibration and validation process.
Keywords: GPS based household surveys, transportation infrastructure, origin-destination trip matrices, traffic forecasts, transportation demand modeling, travel behavior patterns.
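The trip distribution coefficients mentioned above typically feed a gravity model, T_ij proportional to P_i * A_j * f(c_ij), balanced so that row and column sums match the zonal productions and attractions. The sketch below is a textbook doubly-constrained (Furness) balancing with an exponential deterrence function; the zones, costs and trip totals are made up for illustration and do not represent the VISUM models used in the study.

```python
import numpy as np

def gravity_model(productions, attractions, cost, beta=0.1, iters=100):
    """Doubly-constrained gravity model with an exponential deterrence function."""
    f = np.exp(-beta * cost)                       # deterrence matrix f(c_ij)
    a = np.ones(len(productions))                  # balancing factors
    b = np.ones(len(attractions))
    for _ in range(iters):
        a = 1.0 / (f @ (b * attractions))
        b = 1.0 / (f.T @ (a * productions))
    return np.outer(a * productions, b * attractions) * f

# Three hypothetical zones (trip totals must balance for the doubly-constrained form).
P = np.array([1000.0, 3000.0, 500.0])              # trip productions
A = np.array([800.0, 2900.0, 800.0])               # trip attractions
C = np.array([[5.0, 30.0, 20.0],
              [30.0, 5.0, 15.0],
              [20.0, 15.0, 5.0]])                   # travel costs (minutes)

T = gravity_model(P, A, C)
print(np.round(T, 1))
print("row sums:", np.round(T.sum(axis=1), 1), "col sums:", np.round(T.sum(axis=0), 1))
```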
2346 Performance Improvement in the Bivariate Models by Using Modified Marginal Variance of Noisy Observations for Image-Denoising Applications
Authors: R. Senthilkumar
Abstract:
Most simple nonlinear thresholding rules for wavelet-based denoising assume that the wavelet coefficients are independent. However, wavelet coefficients of natural images have significant dependencies. This paper attempts to give a recipe for selecting one of the popular image-denoising algorithms based on VisuShrink, SureShrink, OracleShrink, BayesShrink and BiShrink, and also compares different bivariate models used for image-denoising applications. The first part of the paper compares different shrinkage functions used for image denoising. The second part compares different bivariate models, and the third part uses the bivariate model with modified marginal variance, which is based on a Laplacian assumption. The paper gives an experimental comparison on six commonly used 512x512 images: Lenna, Barbara, Goldhill, Clown, Boat and Stonehenge. Noise powers of 25 dB, 26 dB, 27 dB, 28 dB and 29 dB are added to the six standard images and the corresponding Peak Signal to Noise Ratio (PSNR) values are calculated for each noise level.
Keywords: BiShrink, Image denoising, PSNR, Shrinkage function.
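For context, the BiShrink rule referenced above shrinks each wavelet coefficient jointly with its parent. The sketch below implements the bivariate shrinkage function as it is commonly stated in the literature, with sigma_n the noise standard deviation and sigma the locally estimated marginal signal standard deviation; applying it to a full image would additionally need a wavelet transform and local variance estimation, which are omitted here.

```python
import numpy as np

def bivariate_shrink(y1, y2, sigma_n, sigma):
    """Bivariate shrinkage of a wavelet coefficient y1 given its parent y2.

    Implements w1 = y1 * max(sqrt(y1^2 + y2^2) - sqrt(3)*sigma_n^2/sigma, 0)
                       / sqrt(y1^2 + y2^2),
    the MAP estimator under a bivariate Laplacian prior.
    """
    r = np.sqrt(y1**2 + y2**2)
    gain = np.maximum(r - np.sqrt(3.0) * sigma_n**2 / sigma, 0.0) / np.maximum(r, 1e-12)
    return y1 * gain

# Example: a weak coefficient with a weak parent is suppressed to zero,
# while a strong coefficient with a strong parent is mostly preserved.
print(bivariate_shrink(y1=2.0, y2=1.0, sigma_n=5.0, sigma=4.0))
print(bivariate_shrink(y1=40.0, y2=30.0, sigma_n=5.0, sigma=4.0))
```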